from the maybe-read-your-laws-before-passing-them dept
Back in 2023, Arkansas passed a social media age verification law so poorly drafted that the bill’s own sponsor couldn’t accurately describe who it covered. The law appeared to exempt TikTok, Snapchat, and YouTube, even as the sponsor publicly insisted those were the exact platforms being targeted. When the state’s own expert witness testified that Snapchat was covered, the state’s attorney contradicted him in the same hearing. That law was struck down on First Amendment and vagueness grounds, then permanently enjoined earlier this year in a suit brought by the trade group NetChoice.
So Arkansas went back to the drawing board and passed Act 900, which was supposed to fix all the problems with the original. Judge Timothy Brooks of the Western District of Arkansas has now preliminarily enjoined that law too, in a ruling that reads like a patient teacher explaining to a student why the homework still doesn’t work despite a rewrite.
The legislature did manage to fix the content-based definition problem that sank the first law, but the progress stops there. Act 900 imposes four main new requirements on social media platforms: a prohibition on “addictive practices,” default settings for minors (including a nighttime notification blackout), privacy default settings at the most protective level, and a parental dashboard requirement. Every single one of these provisions fell apart on review, each in its own special way.
The “addictive practices” provision might be the most impressively broken. Here’s what it actually says platforms must do:
Consistent with contemporary understanding of addiction, compulsory behavior, and child cognitive development, ensure that the social media platform does not engage in practices to evoke any addiction or compulsive behaviors in an Arkansas user who is a minor, including without limitation through notifications, recommended content, artificial sense of accomplishment, or engagement with online bots that appear human.
“Contemporary understanding of addiction” is doing a lot of work here, and it’s not up to the job. There is no clinical consensus that social media use constitutes addiction in any formal sense, so it’s entirely unclear what a company would need to do to comply, which is fatal in a First Amendment context. On top of that, the law makes violations strict liability and ridiculously broad. A plain reading shows the provision is not limited to addiction to the platform itself; a platform can apparently be held liable if its practices “evoke” addiction to off-platform activities. And the statute uses the singular “user,” meaning a single child’s response is enough to trigger liability.
As the court puts it:
Not only does Act 900 impose liability based on a single child’s response to the platform, it does so on a strict liability basis—a platform is liable for a practice that evokes addiction in a single child even if it could not have known through the exercise of reasonable care that the practice would have such an effect. “Businesses of ordinary intelligence cannot reliably determine what compliance requires.”
The state, realizing belatedly that it had written an unworkable law, asked the court to just sort of ignore the strict liability language and read in a specific intent requirement that doesn’t exist anywhere in the text. As the judge notes, that’s not how any of this works. The courts interpret the law as written and are not there to fix the legislature’s mistakes:
Instead of defending the statute the General Assembly enacted, Defendants ask the Court to rewrite it by ignoring the strict liability provision altogether and inserting a specific intent requirement that appears nowhere in the text. The Court cannot do so.
Then there are the default provisions. The court was actually somewhat sympathetic to the idea that the state has a legitimate interest in helping kids sleep. The problem is that the law itself undermines that interest by letting parents flip the nighttime notification blackout off. And the government is not there to fix what parents refuse to do:
While Defendants justify the notification default as an aid to parental authority, they ignore their own evidence that parents are part of the problem. If parents wanted to prevent their children’s sleep from being disrupted by late-night notifications, they have a readily available, free, no-tech solution already at their disposal: taking devices away at night. Yet “86% of adolescents sleep with their phone in the bedroom.” …. The State has provided no evidence that parents lack the tools to assert their authority in this domain, so it appears unlikely that the State’s deferential approach to restricting nighttime notifications will actually serve its stated interest in ensuring minors get enough sleep. This “is not how one addresses a serious social problem.”
The privacy default is worse. It requires platforms to set privacy controls to their most restrictive level for minors — but says nothing about who can change them. Meaning, as the court notes, the minor can just… change them. The state argued this was necessary to protect children from sexual exploitation online. The court points out the obvious problem:
On the other hand, because the default can be changed by the minor, this provision is also wildly underinclusive. Defendants say children need this law to protect them from sexual exploitation online. But the law, in effect, allows children to decide whether they need protection from sexual exploitation online because they are free to depart from the protective default. As Defendants’ evidence shows, teenagers’ developing brains make them less likely than adults to appreciate the risks associated with, for example, making their profiles public… Like the notification default, while the burdens imposed by the privacy default may be slight, they do not appear likely to serve the State’s asserted interest at all. Imposing small burdens on vast quantities of speech for no appreciable benefit is not consistent with the First Amendment. Arkansas cannot sentence speech on the internet to death by a thousand cuts.
Any law that burdens First Amendment-protected speech has to be narrowly tailored to a compelling government interest, and a law that is either under- or over-inclusive is going to have trouble surviving that review. Making it so kids can just turn off the privacy controls fails that test.
But the dashboard provision is where things get genuinely hilarious, in that dark way where you wonder if anyone read the bill before voting on it. Act 900 has three separate definitions for people who interact with platforms: “account holders,” “users,” and “Arkansas users.” The problem is that, according to the statute’s own definitions, a “user” is specifically someone who is not an account holder — in other words, just a visitor to the site who doesn’t have an account. Yes, it’s confusing. The court is confused. Everyone is confused.
Act 900 has one particularly noteworthy problem: “users.” Act 900 has three different definitions for relationships a person can have with a platform. First, an “account holder” is “an individual who primarily uses, manages, or otherwise controls an account or a profile to use a social media platform.” Id. sec. 1, § 4-88-1401(1). “Account holder” is not used in any of the Act’s operative provisions. Second, a “user” is “a person who has access to view all or some of the posts and content on a social media platform but is not an account holder.” Id. § 1401(12). Third, an “Arkansas user” is “an individual who is a resident of the State of Arkansas and who accesses or attempts to access a social media platform while present in this state.” Id. § 1401(2). “Arkansas users” include both “account holders” and “users,” but “users” are definitionally not “account holders.” The addictive practices provision and the default provisions therefore apply to all Arkansas minors, whether they have a social media account or are merely a website visitor. Worse, the dashboard provision applies only to minor “users,” not account holders.
Again: the dashboard provision requires platforms to build parental supervision tools for minor “users.”
Not account holders. Users. Which, as the court notes, is a category that definitionally excludes “account holders.” Meaning the provision only applies to… random anonymous visitors to the website. Those who actually have accounts… apparently aren’t covered?
As the court explains, taking the statute at its word would require platforms to:
(1) collect age information from everyone who visits a covered platform to identify minors; and (2) collect and store identity information for every minor who visits a platform to track their “use habits,” connect them with their parents, and effectuate “tools for a parent to restrict his or her minor child’s access.”
This is a law that claims to be about children’s privacy that accidentally requires mass surveillance and identity collection on every anonymous visitor to a website, just in case one of them turns out to be an Arkansas minor. The court openly “questions whether this was the General Assembly’s intended result” but notes it can’t just rewrite the statute because the legislature picked the wrong word. That’s on them. Just like the earlier provision that the state asked the court to quietly rewrite.
The Arkansas legislature does not appear to be a detail-oriented body.
Oh, and there’s also an audit requirement directing platforms to conduct quarterly audits to ensure their products aren’t “causing minors to engage in compulsory or addiction-driven behavior” — again, including off-platform behavior, apparently. How a platform is supposed to audit for behaviors that happen when users aren’t on the platform is left as an exercise for the reader.
What makes this all so maddening is that none of these problems are subtle. The “user” vs. “account holder” mix-up is the kind of thing any lawyer should catch on a close read. The strict liability plus singular “user” combination in the addictive practices provision is the same kind of drafting sloppiness that doomed the 2023 law. The defaults that can be changed by the very minor they’re supposed to protect — that’s not a hard problem to spot.
There is a reason this pattern keeps repeating.
Passing an unconstitutional law to “protect the kids” from Big Tech generates headlines, press conferences, and signing ceremonies. Governor Sarah Huckabee Sanders got to tweet about how “social media companies have gotten away with exploiting kids for profit” when she signed the original law. That made the news. The permanent injunction two years later, overturning that same law? Barely a ripple. Act 900 itself got its own round of celebratory press. The injunction we’re discussing here will get a fraction of that coverage.
The political asymmetry is kind of the point. State legislatures have figured out that there is essentially no downside to passing obviously unconstitutional social media laws. The upside is maximal: you get to posture as tough on Big Tech, protective of children, and responsive to moral panics about screens and teens. The downside — losing in federal court, wasting state resources on legal fees, and getting lectured by judges about basic First Amendment doctrine — happens quietly, years later, long after the political benefits have been banked.
Arkansas will almost certainly lose its appeal, and either way the legislature will be back next session with a new hastily drafted law that fixes some of Act 900’s problems while introducing fresh ones. And then that will get struck down. And then they’ll try again. Texas, Florida, California, Ohio, Utah, Mississippi, Tennessee, Georgia, and a growing list of other states are running the same play on roughly the same schedule.
The courts keep doing their jobs. NetChoice keeps winning. Judges keep writing careful opinions explaining, for what feels like the hundredth time, that strict scrutiny means what it means, vagueness doctrine exists for a reason, and you cannot simply compel platforms to do whatever you want because you have invoked The Children.
None of it matters to the incentive structure. The headline from the signing ceremony is worth more than the opinion from the courthouse. Until that changes — until voters start holding legislators accountable for passing laws that can’t survive even the most basic constitutional review — we’re going to keep reading rulings like this one. Arkansas just provided the latest installment. There will be more.
Filed Under: 1st amendment, arkansas, free speech, privacy, protect the children, social media, social media addiction, social media safety act
Companies: netchoice