Staff

New Studies Reveal User Behavior, Not Algorithms, Amplifies Political Polarization on Facebook, IG

Research papers unveiled on Thursday indicate that Facebook and Instagram's algorithms aren't solely responsible for the political polarization seen on these platforms, contrary to prior assumptions.


Taken together, the findings suggest that Facebook users gravitate toward content that resonates with their existing views, and that the resulting "echo chambers" lead opposing political factions to rely on very different sources of information and misinformation.


This research, featured in Science and Nature, leveraged unprecedented access to Facebook and Instagram data from around the 2020 election period. The researchers altered various parts of the platforms' algorithms to examine the effects on users' political attitudes and polarization.


Though algorithms have often been criticized for their role in shaping politics on these platforms, and have even prompted calls for regulation, the studies revealed that they don't significantly contribute to polarization.


Talia Jomini Stroud, a professor at the University of Texas at Austin and one of the project's lead researchers, said that while the algorithms strongly shape users' on-platform experience and there is clear ideological segregation in exposure to political news, the commonly proposed changes to social media algorithms had little effect on political attitudes.


Social media algorithms typically curate content for users based on their perceived interests. This has raised concerns about a feedback loop of disinformation, in which users are served progressively more misleading content that reinforces their political views.


Disputes over regulation of Facebook's algorithms and over laws requiring payments to news publishers have already led Meta to block news feeds in Canada, and a similar standoff is brewing in California.


However, the research found that turning off the algorithms' ranking and recommendation features and displaying content in chronological order didn't reduce polarization. Polarization also remained largely unchanged when reshared content was removed from users' feeds, though the spread of misinformation dropped sharply.


Reducing exposure to politically like-minded content in users' feeds didn't significantly affect polarization or political attitudes either, one of the studies found. Across the experiments, the changes did tend to reduce how much time users spent on the platforms.


David Lazer, a Northeastern University professor who worked on all four studies, noted that the algorithm largely serves up what users already want, making it easier for them to do what they're already inclined to do.

Meta, Facebook's parent company, welcomed the research, arguing that it undercuts claims that its algorithms are harmful.


Critics counter that the studies are limited because the data was selected and provided by Meta itself. Nora Benavidez, senior counsel at the nonprofit Free Press, warned against using them to absolve Meta of its role in intensifying political polarization and violence.


The research also shed light on how users of different political ideologies behave on the platforms. Conservatives were the most likely to read and share misinformation, and they had the largest pool of news sources catering to their beliefs: about 97% of the websites identified as spreading misinformation were more popular among conservatives than among liberals.


Lazer described the restrictions Meta placed on the data, mostly on user-privacy grounds, as reasonable, and said more findings are on the way. He stressed how unusual the project was, given how little research of this kind had previously been possible.
