Majority Opinion Author

Elena Kagan

SUPREME COURT OF THE UNITED STATES

Syllabus

MOODY, ATTORNEY GENERAL OF FLORIDA, et al. v. NETCHOICE, LLC, dba NETCHOICE, et al.

CERTIORARI TO THE UNITED STATES COURT OF APPEALS FOR THE ELEVENTH CIRCUIT

No. 22–277. Argued February 26, 2024 – Decided July 1, 2024*[1]

In 2021, Florida and Texas enacted statutes regulating large social-media companies and other internet platforms. The States’ laws differ in the entities they cover and the activities they limit. But both curtail the platforms’ capacity to engage in content moderation – to filter, prioritize, and label the varied third-party messages, videos, and other content their users wish to post. Both laws also include individualized-explanation provisions, requiring a platform to give reasons to a user if it removes or alters her posts.

NetChoice LLC and the Computer & Communications Industry Association (collectively, NetChoice) – trade associations whose members include Facebook and YouTube – brought facial First Amendment challenges against the two laws. District courts in both States entered preliminary injunctions.

The Eleventh Circuit upheld the injunction of Florida’s law, as to all provisions relevant here. The court held that the State’s restrictions on content moderation trigger First Amendment scrutiny under this Court’s cases protecting “editorial discretion.” 34 F. 4th 1196, 1209, 1216. The court then concluded that the content-moderation provisions are unlikely to survive heightened scrutiny. Id., at 1227–1228. Similarly, the Eleventh Circuit thought the statute’s individualized-explanation requirements likely to fall. Relying on Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio, 471 U.S. 626, the court held that the obligation to explain “millions of [decisions] per day” is “unduly burdensome and likely to chill platforms’ protected speech.” 34 F. 4th, at 1230.

The Fifth Circuit disagreed across the board, and so reversed the preliminary injunction of the Texas law. In that court’s view, the platforms’ content-moderation activities are “not speech” at all, and so do not implicate the First Amendment. 49 F. 4th 439, 466, 494. But even if those activities were expressive, the court determined the State could regulate them to advance its interest in “protecting a diversity of ideas.” Id., at 482. The court further held that the statute’s individualized-explanation provisions would likely survive, even assuming the platforms were engaged in speech. It found no undue burden under Zauderer because the platforms needed only to “scale up” a “complaint-and-appeal process” they already used. 49 F. 4th, at 487.

Held: The judgments are vacated, and the cases are remanded, because neither the Eleventh Circuit nor the Fifth Circuit conducted a proper analysis of the facial First Amendment challenges to Florida and Texas laws regulating large internet platforms. Pp. 9–31.

(a) NetChoice’s decision to litigate these cases as facial challenges comes at a cost. The Court has made facial challenges hard to win. In the First Amendment context, a plaintiff must show that “a substantial number of [the law’s] applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep.” Americans for Prosperity Foundation v. Bonta, 594 U.S. 595, 615.

So far in these cases, no one has paid much attention to that issue. Analysis and arguments below focused mainly on how the laws applied to the content-moderation practices that giant social-media platforms use on their best-known services to filter, alter, or label their users’ posts, i.e., on how the laws applied to the likes of Facebook’s News Feed and YouTube’s homepage. They did not address the full range of activities the laws cover, and measure the constitutional against the unconstitutional applications.

The proper analysis begins with an assessment of the state laws’ scope. The laws appear to apply beyond Facebook’s News Feed and its ilk. But it’s not clear to what extent, if at all, they affect social-media giants’ other services, like direct messaging, or what they have to say about other platforms and functions. And before a court can do anything else with these facial challenges, it must “determine what [the law] covers.” United States v. Hansen, 599 U.S. 762, 770.

The next order of business is to decide which of the laws’ applications violate the First Amendment, and to measure them against the rest. For the content-moderation provisions, that means asking, as to every covered platform or function, whether there is an intrusion on protected editorial discretion. And for the individualized-explanation provisions, it means asking, again as to each thing covered, whether the required disclosures unduly burden expression. See Zauderer, 471 U. S., at 651.

Because this is “a court of review, not of first view,” Cutter v. Wilkinson, 544 U.S. 709, 718, n. 7, this Court cannot undertake the needed inquiries. And because neither the Eleventh nor the Fifth Circuit performed the facial analysis in the way described above, their decisions must be vacated and the cases remanded. Pp. 9–12.

(b) It is necessary to say more about how the First Amendment relates to the laws’ content-moderation provisions, to ensure that the facial analysis proceeds on the right path in the courts below. That need is especially stark for the Fifth Circuit, whose decision rested on a serious misunderstanding of First Amendment precedent and principle. Pp. 12–29.

(1) The Court has repeatedly held that ordering a party to provide a forum for someone else’s views implicates the First Amendment if, though only if, the regulated party is engaged in its own expressive activity, which the mandated access would alter or disrupt. First, in Miami Herald Publishing Co. v. Tornillo, 418 U.S. 241, the Court held that a Florida law requiring a newspaper to give a political candidate a right to reply to critical coverage interfered with the newspaper’s “exercise of editorial control and judgment.” Id., at 243, 258. Florida could not, the Court explained, override the newspaper’s decisions about the “content of the paper” and “[t]he choice of material to go into” it, because that would substitute “governmental regulation” for the “crucial process” of editorial choice. Id., at 258. The next case, Pacific Gas & Elec. Co. v. Public Util. Comm’n of Cal., 475 U.S. 1, involved California’s attempt to force a private utility to include material from a certain consumer-advocacy group in its regular newsletter to consumers. The Court held that an interest in “offer[ing] the public a greater variety of views” could not justify compelling the utility “to carry speech with which it disagreed” and thus to “alter its own message.” Id., at 11, n. 7, 12, 16. Then in Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, the Court considered federal “must-carry” rules, which required cable operators to allocate certain channels to local broadcast stations. The Court had no doubt the First Amendment was implicated, because the rules “interfere[d]” with the cable operators’ “editorial discretion over which stations or programs to include in [their] repertoire.” Id., at 636, 643–644. The capstone of this line of precedents, Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, Inc., 515 U.S. 557, held that the First Amendment prevented Massachusetts from compelling parade organizers to admit as a participant a gay and lesbian group seeking to convey a message of “pride.” Id., at 561. It held that ordering the group’s admittance would “alter the expressive content of the[ ] parade,” and that the decision to exclude the group’s message was the organizers’ alone. Id., at 572–574.

From that slew of individual cases, three general points emerge. First, the First Amendment offers protection when an entity engaged in compiling and curating others’ speech into an expressive product of its own is directed to accommodate messages it would prefer to exclude. Second, none of that changes just because a compiler includes most items and excludes just a few. It “is enough” for the compiler to exclude the handful of messages it most “disfavor[s].” Hurley, 515 U. S., at 574. Third, the government cannot get its way just by asserting an interest in better balancing the marketplace of ideas. In case after case, the Court has barred the government from forcing a private speaker to present views it wished to spurn in order to rejigger the expressive realm. Pp. 13–19.

(2) “[W]hatever the challenges of applying the Constitution to ever-advancing technology, the basic principles” of the First Amendment “do not vary.” Brown v. Entertainment Merchants Assn., 564 U.S. 786, 790. And the principles elaborated in the above-summarized decisions establish that Texas is not likely to succeed in enforcing its law against the platforms’ application of their content-moderation policies to their main feeds.

Facebook’s News Feed and YouTube’s homepage present users with a continually updating, personalized stream of other users’ posts. The key to the scheme is prioritization of content, achieved through algorithms. The selection and ranking is most often based on a user’s expressed interests and past activities, but it may also be based on other factors, including the platform’s preferences. Facebook’s Community Standards and YouTube’s Community Guidelines detail the messages and videos that the platforms disfavor. The platforms write algorithms to implement those standards – for example, to prefer content deemed particularly trustworthy or to suppress content viewed as deceptive. Beyond ranking content, platforms may add labels, to give users additional context. And they also remove posts entirely that contain prohibited subjects or messages, such as pornography, hate speech, and misinformation on certain topics. The platforms thus unabashedly control the content that will appear to users.

Texas’s law, though, limits their power to do so. Its central provision prohibits covered platforms from “censor[ing]” a “user’s expression” based on the “viewpoint” it contains. Tex. Civ. Prac. & Rem. Code Ann. §143A.002(a)(2). The platforms thus cannot do any of the things they typically do (on their main feeds) to posts they disapprove – cannot demote, label, or remove them – whenever the action is based on the post’s viewpoint. That limitation profoundly alters the platforms’ choices about the views they convey.

The Court has repeatedly held that type of regulation to interfere with protected speech. Like the editors, cable operators, and parade organizers this Court has previously considered, the major social-media platforms curate their feeds by combining “multifarious voices” to create a distinctive expressive offering. Hurley, 515 U. S., at 569. Their choices about which messages are appropriate give the feed a particular expressive quality and “constitute the exercise” of protected “editorial control.” Tornillo, 418 U. S., at 258. And the Texas law targets those expressive choices by forcing the platforms to present and promote content on their feeds that they regard as objectionable.

That those platforms happily convey the lion’s share of posts submitted to them makes no significant First Amendment difference. In Hurley, the Court held that the parade organizers’ “lenient” admissions policy did “not forfeit” their right to reject the few messages they found harmful or offensive. 515 U. S., at 569. Similarly here, that Facebook and YouTube convey a mass of messages does not license Texas to prohibit them from deleting posts they disfavor. Pp. 19–26.

(3) The interest Texas relies on cannot sustain its law. In the usual First Amendment case, the Court must decide whether to apply strict or intermediate scrutiny. But here, Texas’s law does not pass even the less stringent form of review. Under that standard, a law must further a “substantial governmental interest” that is “unrelated to the suppression of free expression.” United States v. O’Brien, 391 U.S. 367, 377. Many possible interests relating to social media can meet that test. But Texas’s asserted interest relates to the suppression of free expression, and it is not valid, let alone substantial.

Texas has never been shy, and always been consistent, about its interest: The objective is to correct the mix of viewpoints that major platforms present. But a State may not interfere with private actors’ speech to advance its own vision of ideological balance. States (and their citizens) are of course right to want an expressive realm in which the public has access to a wide range of views. But the way the First Amendment achieves that goal is by preventing the government from “tilt[ing] public debate in a preferred direction,” Sorrell v. IMS Health Inc., 564 U.S. 552, 578–579, not by licensing the government to stop private actors from speaking as they wish and preferring some views over others. A State cannot prohibit speech to rebalance the speech market. That unadorned interest is not “unrelated to the suppression of free expression.” And Texas may not pursue it consistent with the First Amendment. Pp. 26–29.

No. 22–277, 34 F. 4th 1196; No. 22–555, 49 F. 4th 439; vacated and remanded.

Kagan, J., delivered the opinion of the Court, in which Roberts, C. J., and Sotomayor, Kavanaugh, and Barrett, JJ., joined in full, and in which Jackson, J., joined as to Parts I, II, and III–A. Barrett, J., filed a concurring opinion. Jackson, J., filed an opinion concurring in part and concurring in the judgment. Thomas, J., filed an opinion concurring in the judgment. Alito, J., filed an opinion concurring in the judgment, in which Thomas and Gorsuch, JJ., joined.

Notes

[1] *Together with No. 22–555, NetChoice, LLC, dba NetChoice, et al. v. Paxton, Attorney General of Texas, on certiorari to the United States Court of Appeals for the Fifth Circuit.