As both members of Congress and federal law enforcement investigate the cause and execution of the Jan. 6 riot at the U.S. Capitol, the role social media played in the riot has emerged as a key issue.
The House select committee investigating the mob attack has asked a wide range of social media and telecommunications companies to preserve the records of hundreds of people, including members of Congress, who may be relevant to the inquiry. Beyond these specific requests, the committee will examine how falsehoods about the 2020 election spread on platforms such as Facebook and Twitter, including how algorithms may have contributed to disinformation and the promotion of extremism. There is also broader interest in how such disinformation spreads to other countries. Meanwhile, federal prosecutors pursuing more than 600 criminal cases are drawing on evidence gathered from the social media accounts that Trump supporters used to organize the effort to stop Congress from certifying President Joe Biden’s victory.
A report we recently released through the Center for Business and Human Rights at New York University’s Stern School of Business sheds light on the relationship between technology platforms and the extreme polarization that can lead to the erosion of democratic values and partisan violence. Facebook, the largest social media platform, has gone out of its way to deny that it contributes to extreme divisiveness, but a growing body of social science research, as well as Facebook’s own actions and leaked internal documents, indicate that a significant link exists.
Our central conclusion, based on a review of more than 50 social science studies and interviews with more than 40 academics, policy experts, activists, and current and former industry people, is that platforms like Facebook, YouTube, and Twitter very likely are not the root cause of political polarization, but they do exacerbate it. Clarifying this point matters for two reasons. First, Facebook’s denials in congressional testimony and other public statements may have muddied the issue in the minds of lawmakers and the public. Second, at a time when the country is trying to make sense of what happened on January 6th while simultaneously turning its attention to the elections of 2022, 2024, and beyond, understanding the pernicious role that popular technology platforms can play in politics should be an urgent priority.
Social media fosters partisan hostility
Facebook’s Mark Zuckerberg has repeatedly dismissed suggestions that his company stirs up discord. “Some people say that the problem is the social networks are polarizing us, but that’s not at all clear from the evidence or research,” he testified before a U.S. House subcommittee in March 2021. Instead, he pointed to a “political and media environment that drives Americans apart.” A few days later, Nick Clegg, Facebook’s vice president for global affairs and communications, made a similar claim: “The evidence simply does not support the idea that social media, or the filter bubbles it supposedly creates, are the unambiguous driver of polarization that many people assert.”
Contrary to Facebook’s claims, however, a range of experts have concluded that the use of social media contributes to partisan hostility in the United States. In an article published in Science in October 2020, a group of 15 researchers summarized the scholarly consensus: “In recent years, social media companies like Facebook and Twitter have played an influential role in political discourse, intensifying political sectarianism.” In August 2021, another quartet of researchers, summarizing their review of the empirical evidence in the journal Trends in Cognitive Sciences, concluded that although social media “is unlikely to be the main driver of polarization,” it is “often a key facilitator.”
Partisan relations are complex, but platforms cannot escape responsibility entirely.
Polarization is a complex phenomenon. Some degree of division is natural in a democracy. In the United States, struggles for social and racial justice have provoked backlash and partisan hostility. But the extreme polarization we are now witnessing, particularly on the political right, has consequences that threaten to undermine democracy itself. These include declining trust in institutions; contempt for facts; legislative dysfunction; erosion of democratic norms; and, worst of all, real-world violence.
Of course, none of this can be blamed entirely on the rise of Silicon Valley. Polarization began to intensify in the United States decades before the advent of Facebook, Twitter, and YouTube. Other factors have contributed to the problem, including the realignment of the political parties, the rise of partisan talk radio and cable television outlets, and heightened racial hostility during Donald Trump’s uniquely divisive presidency.
But that does not make the tech platforms innocent, as Facebook would have us believe. A study published in March 2020 describes an experiment in which participants stopped using Facebook for a month and were then surveyed about their views. The researchers found that deactivating the platform “significantly reduced polarization of views on policy issues,” though it did not lessen divisions based strictly on party identity. “That is consistent with the idea that seeing political content on social media makes people more upset and angrier at the other side, [and more likely] to have stronger views on specific issues,” Matthew Gentzkow, a Stanford University economist and co-author of the study, said in an interview.
Facebook and others point to different studies that raise questions about the relationship between social media and polarization. A 2017 study found that from 1996 to 2016, Americans 65 and older experienced the sharpest increase in polarization, even though they were the group least likely to use social media. A 2020 paper compared the rise in polarization in the United States over four decades with levels in eight other advanced democracies; in the other countries, divisiveness increased more modestly or actually declined. These cross-country variations suggest that factors other than social media are driving America’s polarization over the long run.
Notably, however, both the age-group comparison and the cross-country comparison span several decades, including long periods before the advent of social media. A more recent snapshot of the United States is therefore more relevant. A paper published in March, based on a study of more than 17,000 Americans, found that Facebook’s content-ranking algorithm may limit users’ exposure to news outlets offering viewpoints opposed to their own, and in doing so may heighten polarization.
Maximizing online engagement increases polarization
The basic design of platform algorithms helps explain why they amplify divisive content. “Social media technologies employ popularity-based algorithms that tailor content to maximize user engagement,” the Science co-authors wrote. Maximizing engagement increases polarization, especially within networks of like-minded users. This is “in part because of the contagious power of content that elicits sectarian fear or indignation,” the researchers said.
As we wrote in our report: “Social media companies do not seek to boost user engagement because they want to intensify polarization. They do so because the amount of time users spend on a platform clicking, liking, sharing, and retweeting is also time spent looking at the paid advertising that makes the major platforms so lucrative.”
Facebook is well aware of how its automated systems foster discord. The company conducts extensive internal research on the polarization problem and periodically adjusts its algorithms to reduce the flow of content likely to stoke political extremism and hatred. It has done so during especially fraught periods, such as the tumultuous weeks immediately after the November 2020 election and the days before the April 2021 verdict in the Derek Chauvin trial. But it typically dials back incendiary content only for a limited time; making the adjustments permanent would hurt user engagement.
It’s time for the government to intervene
In a series of investigative articles published the same week as the NYU report, The Wall Street Journal cited internal Facebook documents revealing that company researchers repeatedly identified harmful effects of the company’s platforms, only to have top executives reject proposed reforms. In one episode, a major algorithm change in 2018 backfired, unintentionally heightening anger and divisiveness on the platform, according to Facebook’s own internal research. Yet Zuckerberg reportedly resisted some of the proposed fixes out of concern that they could depress user engagement.
It is clear that Facebook and its social media peers need to move beyond denial and acknowledge their role in intensifying polarization. One place the industry could start is by making permanent, rather than temporary, the algorithm adjustments that lower the temperature. In doing so, the platforms would need to continually refine their automated systems and content moderation policies to avoid removing legitimate political expression. That is admittedly a difficult challenge, but it is one the companies created for themselves by building such vast, pervasive networks. Another step social media companies should take is to disclose how their now-secret algorithms rank, recommend, and remove content. Greater transparency would put lawmakers, regulators, academics, and the general public in a stronger position to assess how the platforms operate and to demand accountability where warranted. Unfortunately, Facebook has recently moved in the opposite direction, in one case cutting off the accounts of New York University researchers who were studying whether the platform was being used to sow distrust in elections. The company accused the NYU team of improperly gathering information, an accusation the researchers denied.
It would be nice if social media companies policed themselves, but they are not doing enough. As a result, government needs to step in and provide the sustained oversight that has so far been lacking. Our report proposes that Congress empower the Federal Trade Commission to go beyond transparency requirements and to draft and enforce a code of conduct defining tech companies’ obligations for addressing hateful, extremist, or threatening content on social media.
For example, these standards could set benchmarks for how much harmful content in various categories may remain on a platform even after automated and human moderation, with fines imposed when the benchmarks are exceeded. Congress could require social media companies to incorporate the new rules into their terms of service with users. If companies then failed to meet the standards, the FTC could use its existing authority over “unfair or deceptive” business practices to initiate enforcement actions. Rep. Jan Schakowsky (D-Ill.) and Rep. Kathy Castor (D-Fla.) have introduced legislation that points in the direction we recommend.
The widespread use of social media has fueled extreme polarization, contributing to declining trust in democratic values, elections, and even scientific facts, such as the need for vaccination in the face of a deadly pandemic. If we fail to recognize and counter these developments, we risk a repeat of what happened at the U.S. Capitol on January 6th, or worse.
Barrett is a senior fellow at New York University’s Stern School of Business and deputy director of the Center for Business and Human Rights, where Sims is a fellow. Hendrix is an associate research scientist and adjunct lecturer at New York University’s Tandon School of Engineering and the founder and editor of Tech Policy Press.
Facebook and Google are general, unrestricted donors to the Brookings Institution. The findings, interpretations, and conclusions presented in this article are solely those of the authors and were not influenced by any donation.