|
|
Newsletter #12—March 2026
|
|
|
|
|
The EU's Deadlock on CSAM Scanning Isn't the End of the Fight
On March 16, European lawmakers reported reaching a deadlock in negotiations to extend a temporary derogation from European privacy law that permits Internet platforms to scan users' communications for child sexual abuse material (CSAM). The derogation was originally passed in 2021 so that platforms could proactively detect and report CSAM while a permanent Child Sexual Abuse Regulation (CSAR) was negotiated. The measure, always meant as a short-term bridge, has already been extended once and is now due to lapse on April 3, 2026.
|
The collapse of extension talks stems from the directly-elected European Parliament's demand that the derogation be aligned with its agenda for the final CSAR: platform scanning of communications should be strictly limited to previously identified ("known") CSAM, plus reports of new material from users or trusted flaggers, and should exclude both AI-based proactive scanning for unknown CSAM or grooming conversations and any scanning inside encrypted communications. The Council of the European Union, representing member states, refused these demands, arguing that such narrowing would render the interim measure ineffective.
|
What is often overlooked is that the original 2021 text was never as narrow as “known-only” in practice. It allowed “state-of-the-art” technologies for detecting both CSAM and solicitation of children, with safeguards such as data protection impact assessments and human review. Platforms (including Google, Meta and Yubo) relied on this flexibility to combine traditional hashing with AI classifiers that flagged unknown images and grooming patterns, generating thousands of reports annually.
|
Part of the problem is that these AI-based reports have become too numerous and too noisy, with many innocent users reported over innocuous content. This is especially true of Meta, which accounts for the vast majority of reports overall; law enforcement agencies themselves have described the flood of AI-reported content coming from Meta as "junk". The Commission's own implementation reports – particularly the latest one – back this up, noting limited or inconclusive evidence of the effectiveness and proportionality of AI tools for unknown CSAM and grooming detection, and citing high error risks and modest real-world impact.
|
Indeed, even the effectiveness of the non-AI hashing technology used to scan for already-known images has recently been called into question. Newly published research shows that the dominant technology for hash-based scanning, Microsoft's PhotoDNA, is worryingly unfit for purpose: CSAM distributors can easily circumvent it, while malicious actors can misuse it to cause false CSAM matches for innocent images.
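PhotoDNA's actual algorithm is proprietary, so the sketch below uses a generic "average hash" – a toy stand-in, not PhotoDNA itself – purely to illustrate the class of weakness the researchers describe: perceptual-hash matching compares short fingerprints derived from pixel values, so small targeted edits to an image can shift its fingerprint enough to evade a match, and adversarial edits can likewise push an innocent image toward a flagged fingerprint.

```python
# Toy "average hash" - a stand-in for perceptual hashes like PhotoDNA,
# whose real algorithm is proprietary. Each pixel above the image's mean
# brightness contributes a 1 bit to the fingerprint, otherwise a 0.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    # Number of differing fingerprint bits; a low distance counts as a "match".
    return sum(x != y for x, y in zip(a, b))

# A 4x4 grayscale "image".
original = [
    [200, 200,  50,  50],
    [200, 200,  50,  50],
    [ 50,  50, 200, 200],
    [ 50,  50, 200, 200],
]

# Visually similar copy with two pixels nudged across the brightness mean.
evasion = [row[:] for row in original]
evasion[0][0] = 100  # was 200
evasion[2][2] = 100  # was 200

h1, h2 = average_hash(original), average_hash(evasion)
print(hamming(h1, h2))  # prints 2: two tiny edits already shift the fingerprint
```

In this toy example, nudging just two pixels flips two bits of the 16-bit fingerprint; real evasion attacks on perceptual hashes work the same way, just with optimization to keep the edits visually imperceptible.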
|
For this reason, some are welcoming the failure of the temporary extension negotiations, arguing that a mass-surveillance-based approach to child protection is dangerous and outmoded. Instead, Internet companies should shift their focus towards making their platforms safer by design, while police should devote theirs to targeting offenders discovered through open web scanning and other traditional investigative methods.
|
|
The collapse of the extension negotiations exposes a deeper tension at the heart of the debate: a growing gap between the evidence base for large-scale scanning and the momentum behind expanding it. While concerns about error rates, proportionality, and effectiveness remain unresolved, the policy drive toward broader detection capabilities continues. Understanding that disconnect, and the institutional and commercial forces that sustain it, will be critical as the EU moves forward with a permanent regulatory framework.
|
|
|
|
|
|
|
|
|
The Latest from Our Blog
|
|
|
Children can easily become targets of dangerous people online – how can you keep them safe?
|
|
|
|
The recent release of documents connected to the Jeffrey Epstein case has revived public attention on child sexual exploitation. High-profile cases like Epstein's are horrifying, and they remind us that abuse can occur even inside the most elite social circles. At the same time, such headline cases can obscure a more common truth: children are most often endangered in everyday …
|
|
|
|
|
|
|
|
When Governments Criminalize Fiction, Real Victims Pay the Price
|
|
|
|
Recent proposals in the United Kingdom to criminalize certain forms of pornography illustrate a growing policy problem: the increasing willingness of governments to treat fictional depictions of sex as though they were evidence of real harm. Two examples currently under discussion in the UK make this shift especially clear. First, the government has proposed banning pornography that depicts strangulation or “breath play”, …
|
|
|
|
|
|
|
|
Beyond the Filter: Sexual Content Online: Who Gets Protected and Who Gets Policed?
|
|
|
|
In the season finale of Beyond the Filter, Jeremy and Brandy are joined by Shambhawi Paudel (ILGA Asia) and Mar Díaz (digital rights and LGBTQ advocate) for a wide-ranging conversation on censorship, platform power, and queer expression across Europe and Asia. From colonial-era morality laws and online entrapment in parts of Asia to shadowbanning, algorithmic bias, and the limits of the …
|
|
|
|
|
|
|
|
|
|
The Elephant in the Lobby
|
|
|
The Media, Stigma, and Sexual Expression: Part One, by Madeline Brooks
|
|
I would like to begin with why the name for this series is The Elephant in the Lobby. There's a larger, dominant set of problems with the anti-pornography lobby, along with its supporters and donors, that everyone seems keen to ignore in favor of censoring disfavored expression. Just like the figurative elephant in the room, it's becoming increasingly difficult for me and others to ignore – especially when the actors at play are all too happy to pretend these problems don't exist.
|
Note that this is not limited to anti-pornography groups. Anti-censorship advocacy groups have issues of their own that will also be addressed here, some of them shared with pro-censorship groups; because they go largely unaddressed and unacknowledged, they only embolden censorship advocates.
|
One of these commonalities is the continued, widespread use of stigmatizing words, phrases, and attitudes. This stigma has crept into public policy and made a comfortable home in laws that criminalize certain types of 'taboo' expression. We have seen the results of this in countries like Australia, Italy, and even the United States. Drawing the Line, a project that works to make policies coherent and genuinely preventive of harm, would be a boon to these lawmakers and to victims of sexual assault, exploitation, sexual violence, CSA, CSAM, and other crimes that have detrimental lasting effects on those who have experienced them firsthand.
|
Speaking of which, let's talk about how victims of these crimes are stigmatized because of what they write, draw, create or talk about online. It's a double gut-punch when a member of fandom gets told they deserve to suffer from the impact of their assault, their abuse, and their exploitation because they had 'a problematic ship', or because they like fictional pornography, or because they are a furry. Not only is their pain minimized, it is weaponized by the anti or antis who dislike the subject matter being made by the person they're attacking. In this way, the member is victimized again. They are told that something is wrong with them, that they actively contribute to the exploitation of the vulnerable and innocent, that they need more 'healthy coping mechanisms.' As if the anti harassing them has a healthy sense of what's best for the other's mental health. "Let me harass someone else into being what I want them to be."
|
There are forums dedicated to putting down subsets of fandoms and niche communities, and they're widely celebrated. They may even be defended by fellow antis if they should happen to be discovered committing the very crimes they accuse others of committing. More egregious examples can be found in organizations with more pull and influence, but sadly they are memory-holed by antis at large. The United Nations decries fictional pornography and consensual adult content created by sex workers, but is still failing to adequately address sexual abuse committed by its Peacekeepers and staff. There are antis that get caught distributing illicit materials to minors, but they are still largely unpunished. Some groups of antis create 'rings' to gather materials to distribute to minors with the intent to groom, even going so far as to disguise this intent as 'helping them recognize dangerous materials' or 'protecting them'.
|
Even in some sex positive and progressive circles, this stigma is cruelly used as a bludgeon or as a way to present themselves to the wider world as a 'saner, gentler niche'. This othering technique splinters anti-censorship movements, creates wedges and walls, and effectively creates resentment from the excluded parties. One only needs to look at subcultures like furry and anime communities to see how this plays out. And in sex work and author communities, this is not uncommon, either. The message is clear: make us look good to the society and critics who wouldn't accept us even on a good day, or we'll toss you out. Taboos are not acceptable. Everything has to appeal to the mainstream political scene. Oh, you're into BDSM? You're setting women back 100 years, appealing to men with a sadistic sense of control and entitlement. You say it's empowering? To who, the misogynist? You must be brainwashed. Please seek a professional for help. If that message infuriated you upon reading it – especially if you have been told this several times – imagine how I grimaced actually writing that out. But write it out I must, in order to illustrate just how harmful this kind of stigmatizing speech really is.
|
|
And the news media doesn't help matters or even attempt to defuse the panics it helps spread. It platforms known bigots, anti-porn 'warriors', tearful parents who apparently want the entire world to pay for their child's safety with the price of liberty and privacy, and people who think characters clad in bikinis are more of a threat to women than people who think I should be reduced to a walking incubator and a living day-care and housemaid with no rights or autonomy. The media knows controversy generates revenue; it's why they allow Jonathan Haidt to promote his book about why social media is ruining the minds of kids for the umpteenth time. I would like to submit an argument in this same vein: news outlets create at least as many problems as social media and pornography, if not more.
|
|
|
|
This is in response to the very first article I wrote. I realize now that the information printed in it is not only dated but has been debunked by other reputable sources. I would therefore like to make several key corrections:
|
- Nothing has been confirmed to be true. Everything listed in the article remains an allegation, and not everyone named in the files was involved in criminal activity. None of it should be taken as fact until a full investigation has been carried out and convictions are made.
- /pol/ was not created after Epstein. In fact, there is proof that it was in the works before the m00t–Epstein meeting allegedly took place.

Keep in mind that at the time I wrote the article, I was going off information put out before the allegations of a 4chan–Epstein link, made after the Epstein file release, were debunked.
|
As much as I criticize the press, those same standards should definitely apply to me as well. I don't necessarily agree with 4chan, but that doesn't mean that I should go around spreading misinformation.
|
|
|
|
|
|
|
Liberato's Speed Upgrade
Liberato is COSL's non-profit hosting service for websites and virtual machines. Liberato powers all of COSL's online projects, but also welcomes other customers who are looking for an alternative to mainstream hosting providers.
|
In an age where major online platforms are becoming increasingly restrictive and governments are demanding that websites identify their users, Liberato refuses to play along. Its server cluster is hosted in a neutral jurisdiction outside of the USA or Europe, and does not rely on major cloud providers. What this means is that our content can't be censored, our users can't be tracked, and we answer only to our customers.
|
In the past month, Liberato has made some major improvements to its service. We have retired one of our old servers and replaced it with a much newer machine with SSD storage – resulting in much faster access speeds. We'll be rolling out similar upgrades across the network as funding permits. We've also just introduced a new, streamlined method of provisioning virtual machines, which cuts the time from purchasing a machine to being able to log in to just a few minutes.
|
If you are in need of website or virtual machine hosting, consider Liberato. We're on your side when it comes to ensuring your freedom of expression, privacy, and safety, and every dollar that you pay for our services goes towards supporting COSL and its projects.
|
|
|
|
|
|
|
|
|
|
|
Campaign Organizer – Digital Rights & Online Privacy
We are seeking a part-time early-career digital rights advocate, policy student, or campaign organizer to help build a public interest campaign addressing a new wave of state-level online age-verification and age-assurance laws in the United States.
|
Recent legislation—including California's Digital Age Assurance Act (AB 1043) and Colorado's SB26-051—would require operating systems, platforms, or online services to implement mechanisms that determine users' ages before allowing access to certain online content or features.
|
While these laws are often framed as child-safety measures, they raise significant concerns about privacy, freedom of expression, anonymous speech, data security, and the future architecture of the open Internet.
|
This campaign will focus on ensuring that policymakers and the public understand the civil liberties implications of large-scale online age verification, while advocating for child safety approaches that protect both young people and fundamental rights.
|
This is a one-month internship position that carries a stipend of $1,000, with possible extension based on available resources. For more information and to apply, click the button below.
|
|
|
|
|
|
|
|
|
|
Support our work!
Pledging your monthly support for our work is the best way that you can support us, because it gives us the stability to plan ahead. You can pledge your support at three levels.
|
|
|
|
|
|
|
|
|
|
Keep In Touch With Us
|
|
|
|
|
|
|
|
|