Social Media’s Influence on Knowledge, Beliefs, and Behaviors
New Studies Explore the Complex Relationship
Introduction
Recently, four groundbreaking academic papers were published simultaneously in two prestigious journals, Nature and Science. These papers, authored by top-tier researchers at renowned universities in the United States, address one of the most pressing public-policy questions of our time: how social media shapes our knowledge, beliefs, and behaviors. Drawing on data collected from millions of Facebook users over several months, the studies document the platform’s significant influence on information consumption, time spent online, and awareness of news events. They also find that Facebook’s algorithms tend to show users content that aligns with their existing beliefs, helping to create political “filter bubbles,” and that misinformation circulates disproportionately among politically conservative users.
The Role of Social Media in Promoting Misinformation and Polarization
While the studies note social media’s role in specific harmful events, such as the spread of rumors that led to violence in Sri Lanka and the organization of mob attacks in Brazil and the United States, they focus on broader concerns about misinformation and polarization. The prevailing theory holds that social media, with algorithms tailored to each user’s interests and biases, intensifies the “filter bubble” phenomenon: users are exposed to increasingly skewed versions of reality, eroding consensus and hindering understanding between opposing sides. The theory gained widespread attention after Donald Trump’s election in 2016, when publications such as New York Magazine and Wired attributed his victory, and the erosion of democracy more broadly, to filter bubbles.
Testing the Filter Bubble Effect
Despite these claims, a rigorous test of the filter bubble effect was lacking until now. The four new studies, part of a series of 16 peer-reviewed papers resulting from a collaboration between Meta (the parent company of Facebook and Instagram) and researchers from esteemed institutions like Princeton, Dartmouth, and Stanford, shed light on this complex issue. Meta gave the researchers unprecedented access to data from more than 200 million users and allowed them to run randomized controlled experiments. The collaboration was coordinated by NORC at the University of Chicago, a nonpartisan research organization, and Meta spent $20 million on the project. Although Meta did not pay the researchers directly, some authors had previously received funding from the company. To protect the independence of the research, the research questions were preregistered before data collection, and users’ privacy was safeguarded throughout.
Findings
The collective findings of these studies confirm the first part of the filter bubble theory. Facebook users do tend to see posts from like-minded sources, resulting in high levels of “ideological segregation” and limited overlap between the content consumed by liberal and conservative users. Misinformation is concentrated in a conservative section of the platform, making right-wing users more likely to encounter political falsehoods.
However, the studies find little support for the second part of the theory: that this filtered content significantly shapes people’s beliefs and worldviews. Experimental manipulations that reduced exposure to like-minded sources or removed algorithmic ranking from users’ feeds had no discernible effect on polarization, political attitudes, or factual knowledge. Even removing content reshared by other users made no significant difference.
The Implications
These findings challenge not only assumptions about social media but also core beliefs about how people form their views. Researchers have long observed that the link between information consumption and belief is looser than commonly assumed, and the new studies suggest it is weaker still. If presenting people with new information fails to change their beliefs or political support, the implications extend beyond journalism: it bears on how voters perceive the world and whether they can hold democratic leaders accountable.