case preview
Written by Amy Howe
February 23, 2024
4:14 p.m.
Oral arguments in both cases begin Monday at 10 a.m. ET. (Trekandshoot via Shutterstock)
The relationship between the government and social media will once again be the centerpiece of Monday’s Supreme Court arguments, in NetChoice v. Paxton and Moody v. NetChoice. The cases are among several social media disputes the court will hear this term. On Monday the justices will consider the constitutionality of controversial Texas and Florida laws regulating how major social media companies like Facebook and X (formerly Twitter) manage content posted on their sites.
Texas and Florida have defended their laws, characterizing them simply as efforts to address “discrimination by social media platforms.” But the technology groups challenging the laws counter that they are an “unusual assertion of government power over speech” that violates the First Amendment in many ways.
Legislatures in Texas and Florida passed the laws in 2021 in response to the belief that social media companies were censoring users, especially those with conservative views. As drafted, the laws do not apply to conservative social media platforms such as Parler, Gab, and Truth Social.
Florida’s law originally contained an exception for theme park and entertainment companies, so that it did not apply to Disney and Universal Studios, which operate in the state. But the state legislature stripped that protection in 2022 after Disney officials criticized the state’s “Don’t Say Gay” law.
Although the two states’ laws are not identical, they share common themes. Both contain provisions that limit the choices social media platforms can make about whether and how user-generated content is published: Florida’s law, for example, prohibits platforms from banning candidates for political office or limiting the visibility of candidates’ posts. And both laws require platforms to provide users with individualized explanations of the platforms’ editorial decisions.
Two industry groups representing social media platforms, whose members include Google (which owns YouTube), X (formerly Twitter), and Meta (which owns Facebook), filed suit in federal court to challenge the laws.
A federal district judge in Tallahassee, Florida, blocked the state from enforcing most of its law. The U.S. Court of Appeals for the Eleventh Circuit agreed that key provisions of Florida’s law likely violated the First Amendment and left that ruling largely in place. The state then went to the Supreme Court in 2022, asking the justices to weigh in.
A federal judge in Austin, Texas, put that state’s law on hold before it took effect, but the U.S. Court of Appeals for the Fifth Circuit disagreed and lifted the injunction. That prompted the tech groups to go to the Supreme Court, which temporarily blocked the law in May 2022 while the litigation continued.
After the Fifth Circuit ultimately upheld the law, the tech groups returned to the Supreme Court, which agreed last fall to review both states’ laws.
The states have defended the laws, describing social media platforms as the new “digital public squares,” with significant control over the news their citizens see and share. States, they say, have historically had the power to protect their citizens’ access to that information. And the platforms’ ultimate goal, the states argue, is to avoid any regulation at all. The platforms’ position, if accepted, Florida contends, “threatens to disempower the people’s representatives” from preventing abuses of power over the channels of discourse.
The states argue that their laws do not implicate the First Amendment at all: requiring social media platforms to host speech is not itself speech, but conduct that a state can regulate to protect its citizens. The platforms’ business models, the states note, depend on billions of people being able to post their own speech on the platforms. That is very different from, for example, a newspaper that creates and publishes its own content.
To support the claim that they are merely regulating the platforms’ conduct, the states point, for example, to Supreme Court case law holding that a state could require a shopping mall to allow high school students to solicit signatures for a political petition on its property, and that a federal law requiring law schools to choose between giving military recruiters access to their campuses and forfeiting federal funding did not violate the First Amendment.
The states also argue that the First Amendment does not apply because their laws simply treat the platforms like “carriers,” such as telephone or telegraph companies. The laws, they say, impose only the basic requirement that the platforms, as carriers, not discriminate in providing their services, the kind of nondiscrimination rule that has governed carriers for centuries.
But even if their laws do regulate speech, the states continue, the laws do not target any particular content on the platforms; they seek only to ensure that speakers retain access to the “modern public square.” As a result, the states contend, the laws are subject to a less stringent standard of review.
Finally, as to the provisions requiring social media platforms to provide individualized explanations of their content-moderation decisions, the states argue that those requirements are consistent with the Supreme Court’s 1985 ruling that a state can require companies to disclose “purely factual and uncontroversial” information about their services. Indeed, Texas has suggested that the platforms can use automated processes to comply with these provisions.
The tech groups push back against the states’ contention that the Texas and Florida laws have no First Amendment implications. The First Amendment, the groups write, protects the right of private social media platforms, not the government, to decide what messages they will or will not disseminate. “Just as the state of Florida does not dictate what opinion pieces the New York Times publishes or what interviews Fox News broadcasts,” they emphasize, it cannot dictate what content Facebook or YouTube must disseminate.
There is, the tech groups explain, a “cacophony of voices on the internet, ranging from incitement and obscenity to political debate and friendly banter.” As a result, they say, social media platforms must make billions of editorial decisions each day. These decisions, they observe, take two forms. First, there are decisions about what content to remove: Facebook, for example, restricts hate speech, bullying, and harassment, while YouTube prohibits pornography and violent content. Second, there are decisions about how the remaining content appears on the site for individual users.
The tech groups argue that the Texas and Florida laws restrict the platforms’ speech because they interfere with the platforms’ right to exercise editorial discretion. In particular, they emphasize, the laws require the platforms to disseminate virtually all speech by the states’ preferred speakers, no matter how blatantly or repeatedly those speakers violate a website’s terms of use.
And while the states rely on a line of cases holding that there is no First Amendment right to refuse to host someone else’s speech, the tech groups point to another line of cases in which the Supreme Court recognized that the First Amendment protects exactly that right, holding, for example, that a state cannot require a newspaper to give political candidates a right to respond to criticism, and that the private organizers of a parade cannot be required to include a group whose message they do not approve.
Because the “raison d’être” of the states’ laws is to override the editorial judgment of “Big Tech” about what speech is allowed on its websites, the tech groups conclude, the laws are subject to the most rigorous form of First Amendment review, known as strict scrutiny. And the laws cannot pass that test, the groups argue, because even if a state has an interest in giving its residents access to a wide range of views on social media, that interest does not justify requiring private social media platforms to publish content with which they do not agree.
Nor can the states justify regulating the social media platforms as carriers, the tech groups continue. There is, they say, no tradition of treating private entities like social media platforms as common carriers of speech. And even if there were, the laws at issue in these cases are not traditional carrier regulations because, among other things, they regulate only some social media platforms.
Finally, the tech groups tell the justices that the provisions requiring social media platforms to provide individualized explanations and disclosures when they exercise editorial discretion are also unconstitutional, because those provisions compel the platforms to speak and impose a “significant burden” that makes the platforms less likely to exercise their discretion at all. The requirement, the tech groups suggest, is “the equivalent of asking a newspaper to explain every decision not to publish a million letters to the editor.”
The Biden administration filed a “friend of the court” brief supporting the tech groups. It emphasizes that although the First Amendment protects social media platforms’ efforts to moderate the content on their sites, that does not mean the platforms can never be regulated. But in these cases, the brief contends, the states cannot show that their laws would survive even a more lenient form of First Amendment scrutiny. In particular, U.S. Solicitor General Elizabeth Prelogar writes, the Supreme Court has “repeatedly rejected” the premise of the states’ arguments: that “the government has a legitimate interest in increasing the diversity of views” presented by a particular private speaker, “even if that speaker controls a powerful or dominant platform.”
The Biden administration is scheduled to return to the court in March in a separate case involving its own relationship with social media. In Murthy v. Missouri, set for argument on March 18, the justices will consider whether and to what extent government officials can communicate with social media companies about their content-moderation policies.
This article was originally published at Howe on the Court.