The EU Killed Voluntary CSAM Scanning. West Virginia Is Trying To Compel It. Both Cause Problems.

By News Room, 19 hours ago

from the tricky-problems dept

Last week, the European Parliament voted to let lapse a temporary exemption that had allowed tech companies to scan their services for child sexual abuse material (CSAM) without running afoul of strict EU privacy regulations. Meanwhile, here in the US, West Virginia’s Attorney General continues to press forward with a lawsuit designed to force Apple to scan iCloud for CSAM, apparently oblivious to the fact that succeeding would hand defense attorneys the best gift they’ve ever received.

Two different jurisdictions. Two diametrically opposed approaches, both claiming to protect children, and both making it harder to actually do so.

I’ll be generous and assume people pushing both of these views genuinely think they’re doing what’s best for children. This is a genuinely complex topic with real, painful tradeoffs, and reasonable people can weigh them differently. What’s frustrating is watching policymakers on both sides of the Atlantic charge forward with approaches that seem driven more by vibes than by any serious engagement with how the current system actually works — or why it was built the way it was.

The European Parliament just voted against extending a temporary regulation that had exempted tech platforms from GDPR-style privacy rules when they voluntarily scanned for CSAM. This exemption had been in place (and repeatedly extended) for years while Parliament tried to negotiate a permanent framework. Those negotiations have been going on since November 2023 without resolution, and on Thursday MEPs decided they were done extending the stopgap.

To be clear, Parliament didn’t pass a law banning CSAM scanning. Companies can still technically scan if they want to. But without the exemption, they’re now exposed to massive privacy liability under EU law for doing so. Scanning private messages and stored content to look for CSAM is, after all, mass surveillance — and European privacy law treats mass surveillance seriously (which, in most cases, it should!). So the practical effect is a chilling one: companies that were voluntarily scanning now face significant legal risk if they continue.

The digital rights organization EDRi framed the issue in stark terms:

“This is actually just enabling big tech companies to scan all of our private messages, our most intimate details, all our private chats so it constitutes a really, really serious interference with our right to privacy. It’s not targeted against people that are suspected of child abuse — It’s just targeting everyone, potentially all of the time.”

And that argument is compelling. Hash-matching systems that compare uploaded images against databases of known CSAM are more targeted than, say, keyword scanning of every message, but they still fundamentally involve examining every unencrypted piece of content that passes through the system. When EDRi says it targets “everyone, potentially all of the time,” that’s an accurate description of how the technology works.
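
To make that concrete, here is a minimal sketch of what server-side hash matching involves. SHA-256 is used purely to keep the example self-contained (production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding), and the hash set is a hypothetical stand-in for the lists clearinghouses distribute:

```python
import hashlib

# Hypothetical stand-in for a clearinghouse hash list. Real deployments
# use perceptual hashes (e.g. Microsoft's PhotoDNA) that survive resizing
# and re-encoding; SHA-256 is used here only so the sketch runs as-is.
# The sample entry below is the SHA-256 digest of the empty byte string.
known_hashes: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_upload(file_bytes: bytes) -> bool:
    """Return True if an upload matches a known hash.

    The structural point: every unencrypted upload from every user
    gets hashed and checked, regardless of suspicion.
    """
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

if scan_upload(b""):  # the empty file matches the sample entry above
    print("match -> platform files a CyberTip report with NCMEC")
```

The check itself is cheap and narrow, but it runs against everything, which is exactly the property both EDRi and Europol are describing from opposite ends.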

But… the technology also works to find and catch CSAM. Europol’s executive director, Catherine De Bolle, pointed to concrete numbers:

Last year alone, Europol processed around 1.1 million of so-called CyberTips, originating from the National Center for Missing & Exploited Children (NCMEC), of relevance to 24 European countries. CyberTips contain multiple entities (files, videos, photos etc.) supporting criminal investigation efforts into child sexual abuse online.

If the current legal basis for voluntary detection by online platforms were to be removed, this is expected to result in a serious reduction of CyberTip referrals. This would undermine the capability to detect relevant investigative leads on CSAM, which in turn will severely impair the EU’s security interests of identifying victims and safeguarding children.

The companies that have been doing this scanning — Google, Microsoft, Meta, Snapchat, TikTok — released a joint statement saying they are “deeply concerned” and warning that the lapse will leave “children across Europe and around the world with fewer protections than they had before.”

So the EU’s privacy advocates aren’t wrong about the surveillance problem. Europol isn’t wrong about the child safety consequences. Both things are true — which is what makes this genuinely tricky rather than a case of one side being obviously right.

Now flip to the United States, where the problem is precisely inverted.

In the US, the existing system has been carefully constructed around a single, critical principle: companies voluntarily choose to scan for CSAM, and when they find it, they’re legally required to report it to NCMEC. The word “voluntarily” is doing enormous load-bearing work in that sentence — and most of the people currently shouting about CSAM don’t seem to know it. As Stanford’s Riana Pfefferkorn explained in detail on Techdirt when a private class action lawsuit against Apple tried to compel CSAM scanning:

While the Fourth Amendment applies only to the government and not to private actors, the government can’t use a private actor to carry out a search it couldn’t constitutionally do itself. If the government compels or pressures a private actor to search, or the private actor searches primarily to serve the government’s interests rather than its own, then the private actor counts as a government agent for purposes of the search, which must then abide by the Fourth Amendment, otherwise the remedy is exclusion.

If the government – legislative, executive, or judiciary – forces a cloud storage provider to scan users’ files for CSAM, that makes the provider a government agent, meaning the scans require a warrant, which a cloud services company has no power to get, making those scans unconstitutional searches. Any CSAM they find (plus any other downstream evidence stemming from the initial unlawful scan) will probably get excluded, but it’s hard to convict people for CSAM without using the CSAM as evidence, making acquittals likelier. Which defeats the purpose of compelling the scans in the first place.

In the US, if the government forces Apple to scan, that makes Apple a government agent. Government agents need warrants. Apple can’t get warrants. So the scans are unconstitutional. So the evidence gets thrown out. So the predators walk free. All because someone thought “just make them scan!” was a simple solution to a complex problem.

Congress apparently understood this when it wrote the federal reporting statute (18 U.S.C. § 2258A) — that’s why the law explicitly disclaims any requirement that providers proactively search for CSAM. The voluntariness of the scanning is what preserves its legal viability. Everyone involved in the actual work of combating CSAM — prosecutors, investigators, NCMEC, trust and safety teams — understands this and takes great care to preserve it.

Everyone, apparently, except the Attorney General of West Virginia. As we discussed recently, West Virginia just filed a lawsuit demanding that a court order Apple to “implement effective CSAM detection measures” on iCloud. The remedy West Virginia seeks — a court order compelling scanning — would spring the constitutional trap that everyone who actually works on this issue has been carefully avoiding for years.

As Pfefferkorn put it:

Any competent plaintiff’s counsel should have figured this out before filing a lawsuit asking a federal court to make Apple start scanning iCloud for CSAM, thereby making Apple a government agent, thereby turning the compelled iCloud scans into unconstitutional searches, thereby making it likelier for any iCloud user who gets caught to walk free, thereby shooting themselves in the foot, doing a disservice to their client, making the situation worse than the status quo, and causing a major setback in the fight for child safety online.

The reason nobody’s filed a lawsuit like this against Apple to date, despite years of complaints from left, right, and center about Apple’s ostensibly lackadaisical approach to CSAM detection in iCloud, isn’t because nobody’s thought of it before. It’s because they thought of it and they did their fucking legal research first. And then they backed away slowly from the computer, grateful to have narrowly avoided turning themselves into useful idiots for pedophiles.

The West Virginia complaint also treats Apple’s abandoned NeuralHash client-side scanning project as evidence that Apple could scan but simply chose not to. What it skips over is why the security community reacted so strongly to NeuralHash in the first place. Apple’s own director of user privacy and child safety laid out the problem:

Scanning every user’s privately stored iCloud content would in our estimation pose serious unintended consequences for our users… Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories. How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution? Tools of mass surveillance have widespread negative implications for freedom of speech and, by extension, democracy as a whole.

Once you create infrastructure capable of scanning every user’s private content for one category of material, you’ve created infrastructure capable of scanning for anything. The pipe doesn’t care what flows through it. Governments around the world — some of them not exactly champions of human rights — have a well-documented habit of demanding expanded use of existing surveillance capabilities. This connects directly to the perennial fights over end-to-end encryption backdoors, where the same argument applies: you cannot build a door that only the good guys can walk through.
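
The earlier sketch makes the point almost literally: nothing in a matching function knows what its hash list represents. A hypothetical illustration, with both hash sets invented for the example:

```python
import hashlib
from typing import Iterable

def scan_everything(uploads: Iterable[bytes], against: set[str]) -> list[int]:
    """Return the indices of uploads whose hash appears in `against`.

    Deliberately generic: the code cannot tell whether the set
    describes CSAM, leaked documents, or protest imagery.
    """
    return [
        i for i, blob in enumerate(uploads)
        if hashlib.sha256(blob).hexdigest() in against
    ]

# Both hash sets are invented for the example; the scanner treats
# them identically.
csam_hashes = {hashlib.sha256(b"known-abuse-image").hexdigest()}
protest_hashes = {hashlib.sha256(b"banned-protest-poster").hexdigest()}

uploads = [b"vacation photo", b"banned-protest-poster"]
print(scan_everything(uploads, against=csam_hashes))    # -> []
print(scan_everything(uploads, against=protest_hashes)) # -> [1]
```

Repurposing the pipeline is a one-argument change, which is why “just scan for CSAM” and “scan for whatever the next government demands” are the same infrastructure.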

And then there’s the scale problem. Even the best hash-matching systems can produce false positives, and at the scale of major platforms, even tiny error rates translate into enormous numbers of wrongly flagged users.
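
Back-of-the-envelope numbers show why, with both figures below invented for illustration: assume an optimistic one-in-a-million false-positive rate for a perceptual-hash system and a platform handling a few billion images a day.

```python
# Illustrative figures only, not measured rates for any real system.
false_positive_rate = 1e-6   # one wrong match per million images scanned
images_per_day = 3e9         # plausible volume for a major platform

wrongly_flagged = false_positive_rate * images_per_day
print(f"{wrongly_flagged:,.0f} innocent images flagged per day")
# -> 3,000 innocent images flagged per day, every day
```

Each of those flags is potentially an account lockout, a human review of someone’s private photos, or a police referral.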

This is one of those frustrating stories where you can… kinda see all sides, and there’s no easy or obvious answer:

  • Scanning works, at least somewhat. 1.1 million CyberTips from Europol in a single year. Some number of children identified and rescued because platforms voluntarily detected CSAM and reported it. The system produces real results.
  • Scanning is mass surveillance. Every image, every message gets examined (algorithmically), not just those belonging to suspected offenders. The privacy intrusion is real, not hypothetical, and it falls on everyone.
  • Compelled scanning breaks prosecutions. In the US, the Fourth Amendment means that government-ordered scanning creates a get-out-of-jail-free card for the very predators everyone claims to be targeting. The voluntariness of the system is what makes it legally functional.
  • Scanning infrastructure is repurposable. A system built to detect CSAM can be retooled to detect political speech, religious content, or anything else. This concern is not paranoid; it’s an engineering reality.
  • False positives at scale are inevitable. Even highly accurate systems will flag innocent content when processing billions of items, and the consequences for wrongly accused individuals are severe.

People can and will weigh these tradeoffs differently, and that’s legitimate. The tension described in all this is real and doesn’t resolve neatly.

But what both the EU Parliament’s vote and West Virginia’s lawsuit share is an unwillingness to sit with that tension. The EU stripped legal cover from the voluntary system that was actually producing results, without having a workable replacement ready. West Virginia is trying to compel what must remain voluntary, apparently without bothering to read the constitutional case law that makes compelled scanning self-defeating. From opposite directions, both approaches attack the same fragile voluntary architecture that currently threads the needle between these competing interests.

The status quo in the United States — voluntary scanning, mandatory reporting, no government compulsion to search — is far from perfect. But the system functions: it produces leads, preserves prosecutorial viability, and does so precisely because it was designed by people who understood the tradeoffs and built accordingly.

It would be nice if more policymakers engaged with why the system works the way it does before trying to blow it up from either direction. In tech policy, the loudest voices in the room are rarely the ones who’ve done the reading.

Filed Under: 4th amendment, csam, csam scanning, eu, privacy, scanning, surveillance
