The way in which we access and respond to current news on social media may inadvertently isolate us and place us in social groups with like-minded people. According to a study led by researchers at Princeton University, this leads to polarization of user networks and may add to the politically divisive environment currently prevailing in the U.S.
The team of scientists, which included Christopher Tokita, a PhD student at Princeton, developed a model of complex contagion and applied it to the way reactions to news coverage may spread and drive online polarization. Complex contagion is a phenomenon seen on social media whereby users need multiple exposures to a new idea or innovation before they change their own behavior.
The model also makes use of the concept of “information cascades,” or the process whereby individuals observe and mimic the actions of others, thereby causing a wide online shift in behavior. This phenomenon is not unlike the collective behavior seen in schools of fish or insect swarms.
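The multiple-exposure dynamic that distinguishes complex from simple contagion can be sketched in a few lines of Python. This is an illustrative toy model, not the study's actual model; the network, seed set, and two-exposure threshold are all assumptions made for the example.

```python
def complex_contagion(neighbors, seeds, threshold=2):
    """Spread an idea through a network where a node adopts it only after
    at least `threshold` of its neighbors have adopted -- the
    multiple-exposure requirement that defines complex contagion."""
    adopted = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in neighbors.items():
            if node not in adopted and len(nbrs & adopted) >= threshold:
                adopted.add(node)
                changed = True
    return adopted

# Toy network: an adjacency map from each user to their connections.
network = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c"},
}

# With two seed adopters the idea cascades to the whole network, but a
# single seed cannot clear anyone's two-exposure threshold.
print(complex_contagion(network, seeds={"a", "b"}))  # all four users
print(complex_contagion(network, seeds={"a"}))       # only "a"
```

The cascade in the second half of the sketch mirrors the "information cascade" idea above: once enough neighbors act, each remaining user's threshold is met and the behavior sweeps the network.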
Working with Andy Guess, assistant professor of politics and public affairs at the Princeton School of Public and International Affairs, and Corina Tarnita, professor in Princeton’s Department of Ecology and Evolutionary Biology, Tokita then tested the model’s predictions using data from real social networks on Twitter.
They examined 1,000 followers of each of four news outlets: CBS News, USA Today, Vox, and the Washington Examiner. CBS News and USA Today are mainstream outlets known for fact-based reporting, while Vox and the Washington Examiner provide more slanted, agenda-driven coverage. Over a six-week period in summer 2020, the researchers recorded who followed and unfollowed whom in order to track political and ideological shifts within these social networks.
The findings showed that followers of CBS News and USA Today represented a more ideologically diverse group than followers of the other two media outlets. In addition, the followers of Vox and the Washington Examiner tended to lose political and ideological diversity among their own online connections faster than users who followed CBS News and USA Today. These trends indicate the potential for networks to become politically polarized by virtue of their online activities.
The researchers found that when people are less reactive to news, they continue to receive news feeds from a diversity of sources and their online environment therefore remains politically mixed. However, when users constantly react to and share articles from their preferred news sources, they are more likely to end up in a network of like-minded users and become politically isolated. The researchers refer to these homogeneous but isolated groups as “epistemic bubbles.”
The model shows that, once such a bubble forms, users inside it tend to miss news articles not only from other media outlets but also from their own preferred sources. By tuning out posts they deem “unimportant,” they also filter out news they themselves would consider important.
“Our study shows that, even without social media algorithms, coverage from polarized news outlets is changing users’ social connections and pushing them unknowingly into so-called political ‘echo chambers,’ where they are surrounded by others who share their same political identity and beliefs,” said Tokita, who is now a data scientist at the cybersecurity startup Phylum. “Whether a user chooses to react to or ignore certain news posts can help determine if their social network will become ideologically homogenous or remain more diverse.”
Once a user has unwittingly ended up in a network of like-minded people, he or she can deepen the polarization by “curating,” or managing, online relationships. For example, when viral news stories are shared, users may conclude that some of the “friends” they follow on social media are misrepresenting the news as reported by the users’ own preferred outlets. When they subsequently decide to “unfollow” those untrustworthy connections, they unintentionally sort themselves into polarized networks.
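The unfollow-and-resort dynamic described above can be illustrated with a minimal simulation. Everything here is an assumption for illustration: the update rule, round count, and the use of an ideology mismatch as a stand-in for “perceived misrepresentation” are not taken from the study’s model.

```python
import random

def curate_feed(ideology, follows, n_rounds=200, seed=0):
    """Toy curation loop: each round a random user inspects one followee,
    and if their ideologies differ (a proxy for perceiving that followee
    as misrepresenting the news), unfollows them and follows a randomly
    chosen like-minded user instead."""
    rng = random.Random(seed)
    follows = {u: set(f) for u, f in follows.items()}
    users = sorted(ideology)
    for _ in range(n_rounds):
        u = rng.choice(users)
        if not follows[u]:
            continue
        v = rng.choice(sorted(follows[u]))
        if ideology[v] != ideology[u]:
            follows[u].discard(v)
            like_minded = [w for w in users
                           if w != u and ideology[w] == ideology[u]]
            if like_minded:
                follows[u].add(rng.choice(like_minded))
    return follows

def homogeneity(ideology, follows):
    """Fraction of follow ties that connect same-ideology users."""
    ties = [(u, v) for u, fs in follows.items() for v in fs]
    if not ties:
        return 1.0
    return sum(ideology[u] == ideology[v] for u, v in ties) / len(ties)
```

Under this rule, homogeneity can only rise: each unfollow removes a cross-ideology tie and each refollow adds a same-ideology one, so a mixed network sorts itself step by step, echoing the article’s point that the sorting happens as a side effect of ordinary feed curation.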
“It’s not hard to find evidence of polarized discourse on social media, but we know less about the mechanisms of how social media can drive people apart. Our contribution is to show that polarization of online social networks emerges naturally as people curate their feeds. Counterintuitively, this can occur even without knowing other users’ partisan identities,” Guess said.
Although the divisions currently evident in American politics likely stem from many factors, online interactions have doubtless contributed by shaping human behavior and relationships. The study’s results show that explicit knowledge of other users’ political ideology or alignment is not necessary for social networks to become politically segregated.
The research team advocates for further investigation into how these trends may contribute to the spread and consumption of “fake news” and misinformation, and how inaccurate news fuels political division among members of the public. For example, the study suggests that people who consume and share fake news might be inadvertently isolating themselves from everyone else who follows mainstream sources.
“Though derived from a simple theoretical model of collective dynamics, our results demonstrate the power of a cross-disciplinary approach to the study of political polarization. We hope that they may inspire future examinations into social network-specific algorithms and patterns as potential contributors to societal polarization,” Tarnita said.
The results of the study are published in the journal Proceedings of the National Academy of Sciences.