Florida Statute 501.2041 - Full Text and Legal Analysis

The 2025 Florida Statutes

Title XXXIII
REGULATION OF TRADE, COMMERCE, INVESTMENTS, AND SOLICITATIONS
Chapter 501
CONSUMER PROTECTION
501.2041 Unlawful acts and practices by social media platforms.
(1) As used in this section, the term:
(a) “Algorithm” means a mathematical set of rules that specifies how a group of data behaves and that will assist in ranking search results and maintaining order or that is used in sorting or ranking content or material based on relevancy or other factors instead of using published time or chronological order of such content or material.
(b) “Censor” includes any action taken by a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform.
(c) “Deplatform” means the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.
(d) “Journalistic enterprise” means an entity doing business in Florida that:
1. Publishes in excess of 100,000 words available online with at least 50,000 paid subscribers or 100,000 monthly active users;
2. Publishes 100 hours of audio or video available online with at least 100 million viewers annually;
3. Operates a cable channel that provides more than 40 hours of content per week to more than 100,000 cable television subscribers; or
4. Operates under a broadcast license issued by the Federal Communications Commission.
(e) “Post-prioritization” means action by a social media platform to place, feature, or prioritize certain content or material ahead of, below, or in a more or less prominent position than others in a newsfeed, a feed, a view, or in search results. The term does not include post-prioritization of content and material of a third party, including other users, based on payments by that third party, to the social media platform.
(f) “Shadow ban” means action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform. This term includes acts of shadow banning by a social media platform which are not readily apparent to a user.
(g) “Social media platform” means any information service, system, Internet search engine, or access software provider that:
1. Provides or enables computer access by multiple users to a computer server, including an Internet platform or a social media site;
2. Operates as a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity;
3. Does business in the state; and
4. Satisfies at least one of the following thresholds:
a. Has annual gross revenues in excess of $100 million, as adjusted in January of each odd-numbered year to reflect any increase in the Consumer Price Index.
b. Has at least 100 million monthly individual platform participants globally.
(h) “User” means a person who resides or is domiciled in this state and who has an account on a social media platform, regardless of whether the person posts or has posted content or material to the social media platform.
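Read together, the definitions above reduce to a few mechanical tests. The following Python sketch is purely illustrative, not anything the statute prescribes: it encodes the fourteen-day line in the "deplatform" definition (paragraph (1)(c)) and the size thresholds in subparagraph (1)(g)4. Every class and field name is hypothetical.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Size thresholds from s. 501.2041(1)(g)4. (the revenue figure is
# adjusted for CPI in January of each odd-numbered year).
REVENUE_THRESHOLD_USD = 100_000_000
GLOBAL_MAU_THRESHOLD = 100_000_000

@dataclass
class Platform:  # hypothetical record, for illustration only
    annual_gross_revenue_usd: float
    global_monthly_participants: int
    does_business_in_florida: bool

def is_covered_platform(p: Platform) -> bool:
    """Rough reading of (1)(g): does business in the state and
    satisfies at least one size threshold."""
    meets_size = (p.annual_gross_revenue_usd > REVENUE_THRESHOLD_USD
                  or p.global_monthly_participants >= GLOBAL_MAU_THRESHOLD)
    return p.does_business_in_florida and meets_size

def is_deplatforming(ban_start: date, ban_end: Optional[date]) -> bool:
    """(1)(c): a permanent deletion or ban, or a temporary one lasting
    more than 14 days, counts as deplatforming."""
    if ban_end is None:  # permanent
        return True
    return (ban_end - ban_start).days > 14
```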
(2) A social media platform that fails to comply with any of the provisions of this subsection commits an unfair or deceptive act or practice as specified in s. 501.204.
(a) A social media platform must publish the standards, including detailed definitions, it uses or has used for determining how to censor, deplatform, and shadow ban.
(b) A social media platform must apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.
(c) A social media platform must inform each user about any changes to its user rules, terms, and agreements before implementing the changes and may not make changes more than once every 30 days.
(d) A social media platform may not censor or shadow ban a user’s content or material or deplatform a user from the social media platform:
1. Without notifying the user who posted or attempted to post the content or material; or
2. In a way that violates this part.
(e) A social media platform must:
1. Provide a mechanism that allows a user to request the number of other individual platform participants who were provided or shown the user’s content or posts.
2. Provide, upon request, a user with the number of other individual platform participants who were provided or shown content or posts.
(f) A social media platform must:
1. Categorize algorithms used for post-prioritization and shadow banning.
2. Allow a user to opt out of post-prioritization and shadow banning algorithm categories to allow sequential or chronological posts and content.
(g) A social media platform must provide users with an annual notice on the use of algorithms for post-prioritization and shadow banning and reoffer annually the opt-out opportunity in subparagraph (f)2.
(h) A social media platform may not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about a user who is known by the social media platform to be a candidate as defined in s. 106.011(3)(e), beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate. Post-prioritization of certain content or material from or about a candidate for office based on payments to the social media platform by such candidate for office or a third party is not a violation of this paragraph. A social media platform must provide each user a method by which the user may be identified as a qualified candidate and which provides sufficient information to allow the social media platform to confirm the user’s qualification by reviewing the website of the Division of Elections or the website of the local supervisor of elections.
(i) A social media platform must allow a user who has been deplatformed to access or retrieve all of the user’s information, content, material, and data for at least 60 days after the user receives the notice required under subparagraph (d)1.
(j) A social media platform may not take any action to censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast. Post-prioritization of certain journalistic enterprise content based on payments to the social media platform by such journalistic enterprise is not a violation of this paragraph. This paragraph does not apply if the content or material is obscene as defined in s. 847.001.
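Two of the subsection (2) duties above are operational enough to sketch in code. Paragraphs (2)(f) and (2)(g) require a chronological fallback for users who opt out of ranking algorithms, and paragraph (2)(h) bars post-prioritization and shadow-banning algorithms for known candidates from qualification through the election. The sketch below is a minimal illustration under those readings; every name in it is hypothetical, and the statute prescribes no particular implementation.

```python
from datetime import date, datetime
from typing import NamedTuple

class Post(NamedTuple):      # hypothetical post record
    author_id: str
    published_at: datetime
    relevance_score: float   # output of some ranking algorithm

def build_feed(posts: list[Post], user_opted_out: bool) -> list[Post]:
    """(2)(f)2.: an opted-out user sees sequential/chronological
    content instead of algorithmically prioritized content."""
    if user_opted_out:
        return sorted(posts, key=lambda p: p.published_at, reverse=True)
    return sorted(posts, key=lambda p: p.relevance_score, reverse=True)

def algorithms_barred_for_candidate(qualified_on: date,
                                    election_day: date,
                                    today: date) -> bool:
    """(2)(h): no post-prioritization or shadow-banning algorithms for
    content by or about a known candidate, beginning on the date of
    qualification and ending on the date of the election (or when the
    candidacy ends)."""
    return qualified_on <= today <= election_day
```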
(3) For purposes of subparagraph (2)(d)1., a notification must:
(a) Be in writing.
(b) Be delivered via electronic mail or direct electronic notification to the user within 7 days after the censoring action.
(c) Include a thorough rationale explaining the reason that the social media platform censored the user.
(d) Include a precise and thorough explanation of how the social media platform became aware of the censored content or material, including a thorough explanation of the algorithms used, if any, to identify or flag the user’s content or material as objectionable.
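Subsection (3) makes a (2)(d)1. notice valid only if it is written, delivered within 7 days of the censoring action, and includes both a rationale and an explanation of how the platform became aware of the material; paragraph (2)(i) then keys a 60-day data-access window to the user's receipt of that notice. As a hedged sketch only (the record layout is invented for illustration), those conditions might be checked like this:

```python
from dataclasses import dataclass
from datetime import date, timedelta

DATA_ACCESS_WINDOW = timedelta(days=60)   # s. 501.2041(2)(i)

@dataclass
class CensorshipNotice:         # hypothetical notice record
    in_writing: bool            # (3)(a)
    action_date: date           # date of the censoring action
    delivered_date: date        # (3)(b): email or direct electronic notice
    rationale: str              # (3)(c): thorough reason for the action
    detection_explanation: str  # (3)(d): how the platform became aware

def notice_complies(n: CensorshipNotice) -> bool:
    """True only if all four subsection (3) conditions are met."""
    on_time = (n.delivered_date - n.action_date).days <= 7   # (3)(b)
    return (n.in_writing and on_time
            and bool(n.rationale.strip())
            and bool(n.detection_explanation.strip()))

def data_access_deadline(notice_received: date) -> date:
    """Earliest date access could lawfully end for a deplatformed user."""
    return notice_received + DATA_ACCESS_WINDOW

# Example: notice received July 1, 2025 -> access through August 30, 2025.
assert data_access_deadline(date(2025, 7, 1)) == date(2025, 8, 30)
```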
(4) Notwithstanding any other provisions of this section, a social media platform is not required to notify a user if the censored content or material is obscene as defined in s. 847.001.
(5) If the department, by its own inquiry or as a result of a complaint, suspects that a violation of this section is imminent, occurring, or has occurred, the department may investigate the suspected violation in accordance with this part. Based on its investigation, the department may bring a civil or administrative action under this part. For the purpose of bringing an action pursuant to this section, ss. 501.211 and 501.212 do not apply.
(6) A user may only bring a private cause of action for violations of paragraph (2)(b) or subparagraph (2)(d)1. In a private cause of action brought under paragraph (2)(b) or subparagraph (2)(d)1., the court may award the following remedies to the user:
(a) Up to $100,000 in statutory damages per proven claim.
(b) Actual damages.
(c) If aggravating factors are present, punitive damages.
(d) Other forms of equitable relief, including injunctive relief.
(e) If the user was deplatformed in violation of paragraph (2)(b), costs and reasonable attorney fees. (A damages-exposure sketch follows subsection (7) below.)
(7) For purposes of bringing an action in accordance with subsections (5) and (6), each failure to comply with the individual provisions of subsection (2) shall be treated as a separate violation, act, or practice. For purposes of bringing an action in accordance with subsections (5) and (6), a social media platform that censors, shadow bans, deplatforms, or applies post-prioritization algorithms to candidates and users in the state is conclusively presumed to be both engaged in substantial and not isolated activities within the state and operating, conducting, engaging in, or carrying on a business, and doing business in this state, and is therefore subject to the jurisdiction of the courts of the state.
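Because subsection (7) counts each failure to comply as a separate violation, statutory exposure under paragraph (6)(a) scales linearly with the number of proven claims; actual damages, punitive damages, equitable relief, and fees under (6)(b) through (e) come on top. A back-of-the-envelope sketch (only the $100,000 cap is taken from the statute; the claim count is hypothetical):

```python
STATUTORY_CAP_PER_CLAIM_USD = 100_000    # s. 501.2041(6)(a)

def max_statutory_exposure(proven_claims: int) -> int:
    """Upper bound on statutory damages alone under (6)(a)."""
    return proven_claims * STATUTORY_CAP_PER_CLAIM_USD

# Example: 1,000 proven claims -> up to $100,000,000 in statutory
# damages, the scale the Eleventh Circuit described as "massive
# potential liability" in NetChoice v. Attorney General (2022).
print(f"${max_statutory_exposure(1_000):,}")   # $100,000,000
```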
(8) In an investigation by the department into alleged violations of this section, the department’s investigative powers include, but are not limited to, the ability to subpoena any algorithm used by a social media platform related to any alleged violation.
(9) This section may only be enforced to the extent not inconsistent with federal law and 47 U.S.C. s. 230(e)(3), and notwithstanding any other provision of state law.
(10)(a) All information received by the department pursuant to an investigation by the department or a law enforcement agency of a violation of this section is confidential and exempt from s. 119.07(1) and s. 24(a), Art. I of the State Constitution until such time as the investigation is completed or ceases to be active. This exemption shall be construed in conformity with s. 119.071(2)(c).
(b) During an active investigation, information made confidential and exempt pursuant to paragraph (a) may be disclosed by the department:
1. In the performance of its official duties and responsibilities; or
2. To another governmental entity in performance of its official duties and responsibilities.
(c) Once an investigation is completed or ceases to be active, the following information received by the department shall remain confidential and exempt from s. 119.07(1) and s. 24(a), Art. I of the State Constitution:
1. All information to which another public records exemption applies.
2. Personal identifying information.
3. A computer forensic report.
4. Information that would otherwise reveal weaknesses in a business’s data security.
5. Proprietary business information.
(d) For purposes of this subsection, the term “proprietary business information” means information that:
1. Is owned or controlled by the business;
2. Is intended to be private and is treated by the business as private because disclosure would harm the business or its business operations;
3. Has not been disclosed except as required by law or a private agreement that provides that the information will not be released to the public;
4. Is not publicly available or otherwise readily ascertainable through proper means from another source in the same configuration as received by the department; and
5. Includes:
a. Trade secrets as defined in s. 688.002.
b. Competitive interests, the disclosure of which would impair the competitive advantage of the business that is the subject of the information.
(e) This subsection is subject to the Open Government Sunset Review Act in accordance with s. 119.15 and shall stand repealed on October 2, 2026, unless reviewed and saved from repeal through reenactment by the Legislature.
History: s. 4, ch. 2021-32; s. 2, ch. 2021-33; s. 1, ch. 2022-267.

Cases Citing Statute 501.2041 (1 result)

NetChoice, LLC v. Attorney Gen., State of Florida (11th Cir. 2022).

Published | Court of Appeals for the Eleventh Circuit

...Satisfies at least one of the following thresholds: a. Has annual gross revenues in excess of $100 million . . . b. Has at least 100 million monthly individual platform participants globally. Fla. Stat. § 501.2041(1)(g)....
...But after the onset of this litigation—and after Disney executives made public comments critical of another recently enacted Florida law—the State repealed S.B. 7072’s theme-park-company exemption. See S.B. 6-C (2022). The relevant provisions of S.B. 7072—which are codified at Fla. Stat. §§ 106.072 and 501.2041—can be divided into three categories: (1) content-moderation restrictions; (2) disclosure obligations; and (3) a user-data requirement. (While S.B. 7072 also enacted antitrust-related provisions, only §§ 106.072 and 501.2041 are at issue in this appeal.) Content-Moderation Restrictions • Candida...
...The term “deplatform” is defined to mean “the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.” Id. § 501.2041(1)(c). • Posts by or about candidates: “A social media platform may not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about . . . a candidate.” Id. § 501.2041(2)(h). “Post prioritization” refers to the practice of arranging certain content in a more or less prominent position in a user’s feed or search results. Id. § 501.2041(1)(e). “Shadow banning” refers to any action to “limit or eliminate the exposure of a user or content or material posted by a user to other users of [a] . . . platform.” Id. § 501.2041(1)(f). • “Journalistic enterprises”: A social-media platform may not “censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast.” Id. § 501.2041(2)(j)....
...deo online and has at least 100 million annual viewers, (3) operates a cable channel that provides more than 40 hours of content per week to more than 100,000 cable subscribers, or (4) operates under an FCC broadcast license. Id. § 501.2041(1)(d)....
...The term “censor” is also defined broadly to include not only actions taken to “delete,” “edit,” or “inhibit the publication of” content, but also any effort to “post an addendum to any content or material.” Id. § 501.2041(1)(b). The only exception to this provision’s prohibition is for “obscene” content. Id. § 501.2041(2)(j). • Consistency: A social-media platform must “apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.” Id. § 501.2041(2)(b). The Act does not define the term “consistent.” • 30-day restriction: A platform may not make changes to its “user rules, terms, and agreements . . . more than once every 30 days.” Id. § 501.2041(2)(c). • User opt-out: A platform must “categorize” its post-prioritization and shadow-banning algorithms and allow users to opt out of them; for users who opt out, the platform must display material in “sequential or chronological” order. Id. § 501.2041(2)(f). The platform must offer users the opportunity to opt out annually. Id. § 501.2041(2)(g). Disclosure Obligations • Standards: A social-media platform must “publish the standards, including detailed definitions, it uses or has used for determining how to censor, deplatform, and shadow ban.” Id. § 501.2041(2)(a). • Rule changes: A platform must inform its users “about any changes to” its “rules, terms, and agreements before implementing the changes.” Id. § 501.2041(2)(c). • View counts: Upon request, a platform must provide a user with the number of others who viewed that user’s content or posts. Id. § 501.2041(2)(e). • Candidate free advertising: Platforms that “willfully provide[] free advertising for a candidate must inform the candidate of such in-kind contribution.” Id. § 106.072(4). • Explanations: Before a social-media platform deplatforms, censors, or shadow-bans any user, it must provide the user with a detailed notice. Id. § 501.2041(2)(d)....
...must include both a “thorough rationale explaining the reason” for the “censor[ship]” and a “precise and thorough explanation of how the social media platform became aware” of the content that triggered its decision. Id. § 501.2041(3). (The notice requirement doesn’t apply “if the censored content or material is obscene.” Id. § 501.2041(4).) User-Data Requirement • Data access: A social-media platform must allow a deplatformed user to “access or retrieve all of the user’s information, content, material, and data for at least 60 days” after the user receives notice of deplatforming. Id. § 501.2041(2)(i). Enforcement of § 106.072—which contains the candidate-deplatforming provision—falls to the Florida Elections Commission, which is empowered to impose fines of up to $250,000 per day for violations involving candidates for statewide office and $25,000 per day for those involving candidates for other offices. Id. § 106.072(3). Section 501.2041—which contains S.B. 7072’s remaining provisions—may be enforced either by state governmental actors or through civil suits filed by private parties. Id. § 501.2041(5), (6). Private actions under this section can yield up to $100,000 in statutory damages per claim, actual damages, punitive damages, equitable relief, and, in some instances, attorneys’ fees. Id. § 501.2041(6). C The plaintiffs here—NetChoice and the Computer & Communications Industry Association (together, “NetChoice”)—are trade associations that represent internet and social-media companies like Facebook, Twitter, Google (which owns YouTube), and TikTok....
...They sued the Florida officials charged with enforcing S.B. 7072 under 42 U.S.C. § 1983. In particular, they sought to enjoin enforcement of §§ 106.072 and 501.2041 on a number of grounds, including, as relevant here, that the law’s provisions (1) violate the social-media companies’ right to free speech under the First Amendment and (2) are preempted by federal law. The district court granted NetChoice’s motion and preliminarily enjoined enforcement of §§ 106.072 and 501.2041 in their entirety....
...Gonzalez, 978 F.3d at 1271 n.12 (quotation marks omitted). * * * We will train our attention on the question whether NetChoice has shown a substantial likelihood of success on the merits of its First Amendment challenge to Fla. Stat. §§ 106.072 and 501.2041....
...As an initial matter, in at least one key provision, the Act defines the term “censor” to include “posting an addendum,” i.e., a disclaimer—and thereby explicitly prohibits the very speech by which a platform might dissociate itself from users’ messages. Fla. Stat. § 501.2041(1)(b)....
...7072’s content-moderation restrictions all limit platforms’ ability to exercise editorial judgment and thus trigger First Amendment scrutiny. The provisions that prohibit deplatforming candidates (§ 106.072(2)), deprioritizing and “shadow-banning” content by or about candidates (§ 501.2041(2)(h)), and censoring, deplatforming, or shadow-banning “journalistic enterprises” (§ 501.2041(2)(j)) all clearly restrict platforms’ editorial judgment by preventing them from removing or deprioritizing content or users and forcing them to disseminate messages that they find objectionable. The consistency requirement (§ 501.2041(2)(b)) and the 30-day restriction (§ 501.2041(2)(c)) also—if somewhat less obviously—restrict editorial judgment....
...might find to be similar. These provisions thus burden platforms’ right to make editorial judgments on a case-by-case basis or to change the types of content they’ll disseminate—and, hence, the messages they express. The user-opt-out requirement (§ 501.2041(2)(f), (g)) also triggers First Amendment scrutiny because it forces platforms, upon a user’s request, not to exercise the editorial discretion that they otherwise would in curating content—prioritizing some posts and deprioritizing others—in the user’s feed....
...explain. See Zauderer, 471 U.S. at 651; Nat’l Inst. of Fam. & Life Advocs. v. Becerra, 138 S. Ct. 2361, 2378 (2018) (“NIFLA”). Finally, the exception: We hold that S.B. 7072’s user-data-access requirement (§ 501.2041(2)(i)) does not trigger First Amendment scrutiny....
...It’s also true that the Act applies only to a subset of speakers consisting of the largest social-media platforms and that the law’s enacted findings refer to the platforms’ allegedly “unfair” censorship. See S.B. 7072 § (9), (10); Fla. Stat. § 501.2041(1)(g)....
...Some of these provisions are self-evidently content-based and thus subject to strict scrutiny. The journalistic-enterprises provision, for instance, prohibits a platform from making content-moderation decisions concerning any “journalistic enterprise based on the content of” its posts, Fla. Stat. § 501.2041(2)(j) (emphasis added), and thus applies “because of the ...
...at 163: Removing a journalistic enterprise’s post, for instance, because it is duplicative or too big is permissible, but removing a post to communicate disapproval of its content isn’t. Similarly, the restriction on deprioritizing posts “about . . . a candidate,” id. § 501.2041(2)(h), regulates speech based on “the topic discussed,” Reed, 576 U.S. at 163, and is therefore clearly content-based. At the other end of the spectrum, the candidate-deplatforming (§ 106.072(2)) and user-opt-out (§ 501.2041(2)(f), (g)) provisions are pretty obviously content-neutral....
...tion depends in any way on the substance of platforms’ content-moderation decisions. Some of the provisions—for instance, § 501.2041(2)(b)’s requirement that platforms exercise their content-moderation authority “consistently”—may exhibit both content-based and content-neutral characteristics....
...iry or a stricter form of judicial scrutiny is applied . . . there is no need to determine whether all speech hampered by [the law] is commercial”). A different standard applies to S.B. 7072’s disclosure provisions—§ 106.072(4) and § 501.2041(2)(a), (c), (e), (4)....
...We hold that it is substantially likely that none of S.B. 7072’s content-moderation restrictions survive intermediate—let alone strict—scrutiny. We further hold that there is a substantial likelihood that the “thorough explanation” disclosure requirement (§ 501.2041(2)(d)) is unconstitutional....
...that S.B. 7072 ensures that political candidates and journalistic enterprises are able to communicate with the public, see Fla. Stat. §§ 106.072(2); 501.2041(2)(f), (j)....
...content they find objectionable—to “enhance the relative voice” of certain candidates and journalistic enterprises. Buckley, 424 U.S. at 48–49. There is also a substantial likelihood that the consistency, 30-day, and user-opt-out provisions (§ 501.2041(2)(b), (c), (f), (g)) fail to advance substantial governmental interests....
...sufficient to justify requiring private actors to apply their content-moderation policies—to speak—“consistently.” See § 501.2041(2)(b)....
...Second, there is likely no governmental interest sufficient to justify prohibiting a platform from changing its content-moderation policies—i.e., prohibiting a private speaker from changing the messages it expresses—more than once every 30 days. See § 501.2041(2)(c). Finally, there is likely no governmental interest sufficient to justify forcing platforms to show content to users in a “sequential or chronological” order, see § 501.2041(2)(f), (g)—a requirement that would prevent platforms from expressing messages through post-prioritization and shadow banning. Moreover, and in any event, even if the State could establish that its content-moderation restriction...
...mental interest, it hasn’t even attempted to—and we don’t think it could—show that the burden that those provisions impose is “no greater than is essential to the furtherance of that interest.” O’Brien, 391 U.S. at 377. For instance, §§ 106.072(2) and 501.2041(2)(h) prohibit deplatforming, deprioritizing, or shadow-banning candidates regardless...
...The journalistic-enterprises provision requires platforms to allow any entity with enough content and a sufficient number of users to post anything it wants—other than true “obscen[ity]”— and even prohibits platforms from adding disclaimers or warnings. See Fla. Stat. § 501.2041(2)(j)....
...That seems to us the opposite of narrow tailoring. We conclude that NetChoice has shown a substantial likelihood of success on the merits of its claim that S.B. 7072’s content-moderation restrictions—in Fla. Stat. §§ 106.072(2), 501.2041(2)(b), (c), (f), (g), (h), (j)—violate the First Amendment. [Footnote 23: Even worse, S.B....]
...We assess S.B. 7072’s disclosure requirements—in §§ 106.072(4), 501.2041(2)(a), (c), (d), (e)—under the Zauderer standard: A commercial disclosure requirement must be “reasonably related to the State’s interest in preventing deception of consumers” and must not be “[u]njustified or unduly burdensome” such that it would “chill[] protected speech.” Milavetz, 559 U.S....
...led about platforms’ content-moderation policies. This interest is likely legitimate. On the ensuing burden question, NetChoice hasn’t established a substantial likelihood that the provisions that require platforms to publish their standards (§ 501.2041(2)(a)), [Footnote 24: This interest likely applies to all of the disclosure provisions with the possible exception of the candidate-free-advertising provision (§ 106.072(4)).]...
...NetChoice hasn’t shown that it is substantially likely to be unconstitutional. ... inform users about changes to their rules (§ 501.2041(2)(c)), provide users with view counts for their posts (§ 501.2041(2)(e)), and inform candidates about free advertising (§ 106.072(4)), are unduly burdensome or likely to chill platforms’ speech. So, these provisions aren’t substantially likely to be unconstitutional. But NetChoice does argue that § 501.2041(2)(d)—the requirement that platforms provide notice and a detailed justification for every content-moderation action—is “practically impossible to satisfy.” Br....
...For every one of these actions, the law requires a platform to provide written notice delivered within seven days, including a “thorough rationale” for the decision and a “precise and thorough explanation of how [it] became aware” of the material. See § 501.2041(3)....
...This requirement not only imposes potentially significant implementation costs but also exposes platforms to massive liability: The law provides for up to $100,000 in statutory damages per claim and pegs liability to vague terms like “thorough” and “precise.” See § 501.2041(6)(a). Thus, a platform could be slapped with millions, or even billions, [Footnote 25: Of course, NetChoice still might establish during the course of litigation that these provisions are unduly burdensome and therefore unconstitutional.]...
...that it didn’t provide sufficiently “thorough” explanations when removing posts. It is substantially likely that this massive potential liability is “unduly burdensome” and would “chill[] protected speech”—platforms’ exercise of editorial judgment—such that § 501.2041(2)(d) violates platforms’ First Amendment rights. Milavetz, 559 U.S. at 250. * * * It is substantially likely that S.B. 7072’s content-moderation restrictions (§§ 106.072(2), 501.2041(2)(b), (c), (f), (g), (h), (j)) and its requirement that platforms provide a thorough rationale for every content-moderation action (§ 501.2041(2)(d)) violate the First Amendment. The same is not true of the Act’s other disclosure provisions (§§ 106.072(4), 501.2041(2)(a), (c), (e)) and its user-data-access provision (§ 501.2041(2)(i))....
Provision                      | Fla. Stat. §         | Likely Constitutionality | Disposition
Candidate deplatforming        | 106.072(2)           | Unconstitutional         | Affirm
Posts by/about candidates      | 501.2041(2)(h)       | Unconstitutional         | Affirm
“Journalistic enterprises”     | 501.2041(2)(j)       | Unconstitutional         | Affirm
Consistency                    | 501.2041(2)(b)       | Unconstitutional         | Affirm
30-day restriction             | 501.2041(2)(c)       | Unconstitutional         | Affirm
User opt-out                   | 501.2041(2)(f), (g)  | Unconstitutional         | Affirm
Explanations                   | 501.2041(2)(d)       | Unconstitutional         | Affirm
Standards                      | 501.2041(2)(a)       | Constitutional           | Vacate
Rule changes                   | 501.2041(2)(c)       | Constitutional           | Vacate
User view counts               | 501.2041(2)(e)       | Constitutional           | Vacate
Candidate “free advertising”   | 106.072(4)           | Constitutional           | Vacate
User-data access               | 501.2041(2)(i)       | Constitutional           | Vacate