Published | Court of Appeals for the Eleventh Circuit
...Satisfies at least one of the following thresholds:
a. Has annual gross revenues in excess of $100 million . . .
b. Has at least 100 million monthly individual platform participants globally.
Fla. Stat. § 501.2041(1)(g)....
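The definition quoted above is a simple either/or coverage test; as an illustrative sketch (the function name and example figures are ours, not the statute's):

```python
# Illustrative sketch (not statutory text) of the Fla. Stat. § 501.2041(1)(g)
# definition: a platform is covered if it satisfies EITHER threshold below.
# The function name and the example inputs are hypothetical.

def is_covered_platform(annual_gross_revenue_usd: int, monthly_participants: int) -> bool:
    """Return True if at least one statutory threshold is satisfied."""
    exceeds_revenue = annual_gross_revenue_usd > 100_000_000   # in excess of $100 million
    meets_participants = monthly_participants >= 100_000_000   # at least 100 million monthly
    return exceeds_revenue or meets_participants

print(is_covered_platform(50_000_000, 150_000_000))  # True: user threshold alone suffices
print(is_covered_platform(50_000_000, 1_000_000))    # False: neither threshold met
```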
...But after the onset of this litigation—and after Disney executives made public comments critical of another recently enacted Florida law—the State repealed S.B. 7072’s theme-park-company exemption. See S.B. 6-C (2022).
The relevant provisions of S.B. 7072—which are codified at Fla. Stat. §§ 106.072 and 501.2041 2—can be divided into three categories: (1) content-moderation restrictions; (2) disclosure obligations; and (3) a user-data requirement.
2 While S.B. 7072 also enacted antitrust-related provisions, only §§ 106.072 and 501.2041 are at issue in this appeal.
USCA11 Case: 21-12355 Date Filed: 05/23/2022 Page: 10 of 67
10 Opinion of the Court 21-12355
Content-Moderation Restrictions
• Candida...
...The term “deplatform” is defined to mean “the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.” Id. § 501.2041(1)(c).
• Posts by or about candidates: “A social media platform may not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about . . . a candidate.” Id. § 501.2041(2)(h). “Post prioritization” refers to the practice of arranging certain content in a more or less prominent position in a user’s feed or search results. Id. § 501.2041(1)(e). 3 “Shadow banning” refers to any action to “limit or eliminate the exposure of a user or content or material posted by a user to other users of [a] . . . platform.” Id. § 501.2041(1)(f).
• “Journalistic enterprises”: A social-media platform may not “censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast.” Id. § 501.2041(2)(j)....
...deo online and has at least 100 million annual viewers, (3) operates a cable channel that provides more than 40 hours of content per week to more than 100,000 cable subscribers, or (4) operates under an FCC broadcast license. Id. § 501.2041(1)(d)....
...The term “censor” is also defined broadly to include not only actions taken to “delete,” “edit,” or “inhibit the publication of” content, but also any effort to “post an addendum to any content or material.” Id. § 501.2041(1)(b). The only exception to this provision’s prohibition is for “obscene” content. Id. § 501.2041(2)(j).
• Consistency: A social-media platform must “apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.” Id. § 501.2041(2)(b). The Act does not define the term “consistent.”
• 30-day restriction: A platform may not make changes to its “user rules, terms, and agreements . . . more than once every 30 days.” Id. § 501.2041(2)(c).
• User opt-out: A platform must “categorize” its post-prioritization and shadow-banning algorithms and allow users to opt out of them; for users who opt out, the platform must display material in “sequential or chronological” order. Id. § 501.2041(2)(f). The platform must offer users the opportunity to opt out annually. Id. § 501.2041(2)(g).
Disclosure Obligations
• Standards: A social-media platform must “publish the standards, including detailed definitions, it uses or has used for determining how to censor, deplatform, and shadow ban.” Id. § 501.2041(2)(a).
• Rule changes: A platform must inform its users “about any changes to” its “rules, terms, and agreements before implementing the changes.” Id. § 501.2041(2)(c).
• View counts: Upon request, a platform must provide a user with the number of others who viewed that user’s content or posts. Id. § 501.2041(2)(e).
• Candidate free advertising: Platforms that “willfully provide[] free advertising for a candidate must inform the candidate of such in-kind contribution.” Id. § 106.072(4).
• Explanations: Before a social-media platform deplatforms, censors, or shadow-bans any user, it must provide the user with a detailed notice. Id. § 501.2041(2)(d)....
...must include both a “thorough rationale explaining the reason” for the “censor[ship]” and a “precise and thorough explanation of how the social media platform became aware” of the content that triggered its decision. Id. § 501.2041(3). (The notice requirement doesn’t apply “if the censored content or material is obscene.” Id. § 501.2041(4).)
User-Data Requirement
• Data access: A social-media platform must allow a deplatformed user to “access or retrieve all of the user’s information, content, material, and data for at least 60 days” after the user receives notice of deplatforming. Id. § 501.2041(2)(i).
Enforcement of § 106.072—which contains the candidate-deplatforming provision—falls to the Florida Elections Commission, which is empowered to impose fines of up to $250,000 per day for violations involving candidates for statewide office and $25,000 per day for those involving candidates for other offices. Id. § 106.072(3). Section 501.2041—which contains S.B. 7072’s remaining provisions—may be enforced either by state governmental actors or through civil suits filed by private parties. Id. § 501.2041(5), (6). Private actions under this section can yield up to $100,000 in statutory damages per claim, actual damages, punitive damages, equitable relief, and, in some instances, attorneys’ fees. Id. § 501.2041(6).
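The fine schedule just described scales per day of violation; a hypothetical back-of-the-envelope sketch (the dollar amounts come from the opinion, but the scenario and function are invented for illustration):

```python
# Back-of-the-envelope sketch of the § 106.072(3) fine schedule described
# in the opinion. The 30-day scenario below is invented for illustration.

STATEWIDE_FINE_PER_DAY = 250_000  # candidates for statewide office
OTHER_FINE_PER_DAY = 25_000       # candidates for other offices

def max_fine(days: int, statewide: bool) -> int:
    """Maximum fine for deplatforming one candidate for `days` days."""
    per_day = STATEWIDE_FINE_PER_DAY if statewide else OTHER_FINE_PER_DAY
    return days * per_day

print(max_fine(30, statewide=True))   # 7500000: $7.5 million for a 30-day violation
print(max_fine(30, statewide=False))  # 750000
```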
C
The plaintiffs here—NetChoice and the Computer & Communications Industry Association (together, “NetChoice”)—are trade associations that represent internet and social-media companies like Facebook, Twitter, Google (which owns YouTube), and TikTok....
...They sued the Florida officials charged with enforcing S.B. 7072 under 42 U.S.C. § 1983. In particular, they sought to enjoin enforcement of §§ 106.072 and 501.2041 on a number of grounds, including, as relevant here, that the law’s provisions (1) violate the social-media companies’ right to free speech under the First Amendment and (2) are preempted by federal law.
The district court granted NetChoice’s motion and preliminarily enjoined enforcement of §§ 106.072 and 501.2041 in their entirety....
...Gonzalez, 978 F.3d at 1271 n.12 (quotation marks omitted).
* * *
We will train our attention on the question whether NetChoice has shown a substantial likelihood of success on the merits of its First Amendment challenge to Fla. Stat. §§ 106.072 and 501.2041....
...As an initial matter, in at least one key provision, the Act defines the term “censor” to include “posting an addendum,” i.e., a disclaimer—and thereby explicitly prohibits the very speech by which a platform might dissociate itself from users’ messages. Fla. Stat. § 501.2041(1)(b)....
...7072’s content-moderation restrictions all limit platforms’ ability to exercise editorial judgment and thus trigger First Amendment scrutiny. The provisions that prohibit deplatforming candidates (§ 106.072(2)), deprioritizing and “shadow-banning” content by or about candidates (§ 501.2041(2)(h)), and censoring, deplatforming, or shadow-banning “journalistic enterprises” (§ 501.2041(2)(j)) all clearly restrict platforms’ editorial judgment by preventing them from removing or deprioritizing content or users and forcing them to disseminate messages that they find objectionable.
The consistency requirement (§ 501.2041(2)(b)) and the 30-day restriction (§ 501.2041(2)(c)) also—if somewhat less obviously—restrict editorial judgment....
...might find to be similar. These provisions thus burden platforms’ right to make editorial judgments on a case-by-case basis or to change the types of content they’ll disseminate—and, hence, the messages they express.
The user-opt-out requirement (§ 501.2041(2)(f), (g)) also triggers First Amendment scrutiny because it forces platforms, upon a user’s request, not to exercise the editorial discretion that they otherwise would in curating content—prioritizing some posts and deprioritizing others—in the user’s feed....
...explain. See Zauderer, 471 U.S. at 651; Nat’l Inst. of Fam. & Life Advocs. v. Becerra, 138 S. Ct. 2361, 2378 (2018) (“NIFLA”).
Finally, the exception: We hold that S.B. 7072’s user-data-access requirement (§ 501.2041(2)(i)) does not trigger First Amendment scrutiny....
...It’s also true that the Act applies only to a subset of speakers consisting of the largest social-media platforms and that the law’s enacted findings refer to the platforms’ allegedly “unfair” censorship. See S.B. 7072 § (9), (10); Fla. Stat. § 501.2041(1)(g)....
...Some of these provisions are self-evidently content-based and thus subject to strict scrutiny. The journalistic-enterprises provision, for instance, prohibits a platform from making content-moderation decisions concerning any “journalistic enterprise based on the content of” its posts, Fla. Stat. § 501.2041(2)(j) (emphasis added), and thus applies “because of the ....
...at 163: Removing a journalistic enterprise’s post, for instance, because it is duplicative or too big is permissible, but removing a post to communicate disapproval of its content isn’t. Similarly, the restriction on deprioritizing posts “about . . . a candidate,” id. § 501.2041(2)(h), regulates speech based on “the topic discussed,” Reed, 576 U.S. at 163, and is therefore clearly content-based. At the other end of the spectrum, the candidate-deplatforming (§ 106.072(2)) and user-opt-out (§ 501.2041(2)(f), (g)) provisions are pretty obviously content-neutral....
...tion depends in any way on the substance of platforms’ content-moderation decisions.
Some of the provisions—for instance, § 501.2041(2)(b)’s requirement that platforms exercise their content-moderation authority “consistently”—may exhibit both content-based and content-neutral characteristics....
...iry or a stricter form of judicial scrutiny is applied . . . there is no need to determine whether all speech hampered by [the law] is commercial”).
A different standard applies to S.B. 7072’s disclosure provisions—§ 106.072(4) and § 501.2041(2)(a), (c), (e), (4)....
...We hold that it is substantially likely that none of S.B. 7072’s content-moderation restrictions survive intermediate—let alone strict—scrutiny. We further hold that there is a substantial likelihood that the “thorough explanation” disclosure requirement (§ 501.2041(2)(d)) is unconstitutional....
...that S.B. 7072 ensures that political candidates and journalistic enterprises are able to communicate with the public, see Fla. Stat. §§ 106.072(2); 501.2041(2)(f), (j)....
...content they find objectionable—to “enhance the relative voice” of certain candidates and journalistic enterprises. Buckley, 424 U.S. at 48–49.
There is also a substantial likelihood that the consistency, 30-day, and user-opt-out provisions (§ 501.2041(2)(b), (c), (f), (g)) fail to advance substantial governmental interests....
...sufficient to justify requiring private actors to apply their content-moderation policies—to speak—“consistently.” See § 501.2041(2)(b)....
...Second, there is likely no governmental interest sufficient to justify prohibiting a platform from changing its content-moderation policies—i.e., prohibiting a private speaker from changing the messages it expresses—more than once every 30 days. See § 501.2041(2)(c). Finally, there is likely no governmental interest sufficient to justify forcing platforms to show content to users in a “sequential or chronological” order, see § 501.2041(2)(f), (g)—a requirement that would prevent platforms from expressing messages through post-prioritization and shadow banning.
Moreover, and in any event, even if the State could establish that its content-moderation restriction...
...mental interest, it hasn’t even attempted to—and we don’t think it could—show that the burden that those provisions impose is “no greater than is essential to the furtherance of that interest.” O’Brien, 391 U.S. at 377. For instance, §§ 106.072(2) and 501.2041(2)(h) prohibit deplatforming, deprioritizing, or shadow-banning candidates regardless...
...The journalistic-enterprises provision requires platforms to allow any entity with enough content and a sufficient number of users to post anything it wants—other than true “obscen[ity]”—and even prohibits platforms from adding disclaimers or warnings. See Fla. Stat. § 501.2041(2)(j)....
...23 That seems to us the opposite of narrow tailoring.
We conclude that NetChoice has shown a substantial likelihood of success on the merits of its claim that S.B. 7072’s content-moderation restrictions—in Fla. Stat. §§ 106.072(2), 501.2041(2)(b), (c), (f), (g), (h), (j)—violate the First Amendment.
23 Even worse, S.B....
2
We assess S.B. 7072’s disclosure requirements—in §§ 106.072(4), 501.2041(2)(a), (c), (d), (e)—under the Zauderer standard: A commercial disclosure requirement must be “reasonably related to the State’s interest in preventing deception of consumers” and must not be “[u]njustified or unduly burdensome” such that it would “chill[] protected speech.” Milavetz, 559 U.S....
...led about platforms’ content-moderation policies. 24 This interest is likely legitimate. On the ensuing burden question, NetChoice hasn’t established a substantial likelihood that the provisions that require platforms to publish their standards (§ 501.2041(2)(a)), inform users about changes to their rules (§ 501.2041(2)(c)), provide users with view counts for their posts (§ 501.2041(2)(e)), and inform candidates about free advertising (§ 106.072(4)) are unduly burdensome or likely to chill platforms’ speech. So, these provisions aren’t substantially likely to be unconstitutional. 25
24 This interest likely applies to all of the disclosure provisions with the possible exception of the candidate-free-advertising provision (§ 106.072(4)).... ...NetChoice hasn’t shown that it is substantially likely to be unconstitutional.
But NetChoice does argue that § 501.2041(2)(d)—the requirement that platforms provide notice and a detailed justification for every content-moderation action—is “practically impossible to satisfy.” Br....
...For every one of these actions, the law requires a platform to provide written notice delivered within seven days, including a “thorough rationale” for the decision and a “precise and thorough explanation of how [it] became aware” of the material. See § 501.2041(3)....
...This requirement not only imposes potentially significant implementation costs but also exposes platforms to massive liability: The law provides for up to $100,000 in statutory damages per claim and pegs liability to vague terms like “thorough” and “precise.” See § 501.2041(6)(a). Thus, a platform could be slapped with millions, or even billions,
25 Of course, NetChoice still might establish during the course of litigation that these provisions are unduly burdensome and therefore unconstitutional.
...that it didn’t provide sufficiently “thorough” explanations when removing posts. It is substantially likely that this massive potential liability is “unduly burdensome” and would “chill[] protected speech”—platforms’ exercise of editorial judgment—such that § 501.2041(2)(d) violates platforms’ First Amendment rights. Milavetz, 559 U.S. at 250.
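The “millions, or even billions” figure follows directly from multiplying the $100,000 statutory-damages cap per claim by the volume of moderation actions a large platform takes; a hypothetical illustration (the claim counts below are invented):

```python
# Hypothetical illustration of § 501.2041(6)(a) exposure: up to $100,000 in
# statutory damages per claim. The claim counts are invented for illustration.

STATUTORY_DAMAGES_PER_CLAIM = 100_000

def max_exposure(claims: int) -> int:
    """Worst-case statutory damages for a given number of successful claims."""
    return claims * STATUTORY_DAMAGES_PER_CLAIM

print(max_exposure(100))     # 10000000: "millions" from just 100 contested removals
print(max_exposure(50_000))  # 5000000000: "billions" from 50,000 removals
```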
* * *
It is substantially likely that S.B. 7072’s content-moderation restrictions (§§ 106.072(2), 501.2041(2)(b), (c), (f), (g), (h), (j)) and its requirement that platforms provide a thorough rationale for every content-moderation action (§ 501.2041(2)(d)) violate the First Amendment. The same is not true of the Act’s other disclosure provisions (§§ 106.072(4), 501.2041(2)(a), (c), (e)) and its user-data-access provision (§ 501.2041(2)(i))....
...
Provision                      Fla. Stat. §           Likely Constitutionality   Disposition
Candidate deplatforming        106.072(2)             Unconstitutional           Affirm
Posts by/about candidates      501.2041(2)(h)         Unconstitutional           Affirm
“Journalistic enterprises”     501.2041(2)(j)         Unconstitutional           Affirm
Consistency                    501.2041(2)(b)         Unconstitutional           Affirm
30-day restriction             501.2041(2)(c)         Unconstitutional           Affirm
User opt-out                   501.2041(2)(f), (g)    Unconstitutional           Affirm
Explanations (per decision)    501.2041(2)(d)         Unconstitutional           Affirm
Standards                      501.2041(2)(a)         Constitutional             Vacate
Rule changes                   501.2041(2)(c)         Constitutional             Vacate
User view counts               501.2041(2)(e)         Constitutional             Vacate
Candidate “free advertising”   106.072(4)             Constitutional             Vacate
User-data access               501.2041(2)(i)         Constitutional             Vacate