Newsletter #11—February 2026
Lies, Damned Lies, and CSAM Statistics
For a subject that animates so much Internet regulation, you might think that we'd know how much child sexual abuse material (CSAM) is actually circulating on the Internet. But CSAM isn't directly measurable in the same way as other types of Internet content, because it is illegal to possess or distribute. The only statistics collected about its prevalence therefore come from law enforcement bodies that are lawfully entitled to handle it, and those statistics are frequently misleading or flat-out wrong.

Most recently, as reported by Bloomberg and further unpacked by Riana Pfefferkorn, a series of reports claiming that the Internet was awash with AI-generated images and videos resembling CSAM turned out to be entirely fictitious: they were based on the misinterpretation of a figure that Amazon reported to the National Center for Missing and Exploited Children (NCMEC), the U.S.-based CSAM reporting agency. In fact, none of the 380,000 images that Amazon had reported were AI-generated, and many (we don't know how many) were not CSAM either.

If this seems worryingly inexact, you will not be any more comforted to know that there is no measure of what proportion of the overall CSAM statistics reported by NCMEC – about 20.5 million reports in 2024 – amount to duplicate reports of the same, smaller number of individual files. We do, however, get a window into what that proportion might be from a blog post that Facebook published in 2021, revealing that about 90% of its reports were duplicates. In fact, almost half of its reports related to just six videos that went viral – apparently shared mostly out of either misplaced humor or outrage, rather than out of a sexual interest in children.
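To see how much daylight there can be between report counts and unique files, here is a back-of-the-envelope calculation: a minimal Python sketch using the figures above. Note that the 90% duplication rate is Facebook's own 2021 figure, and we are assuming, purely for illustration, that it holds across all NCMEC reports:

    # Illustrative arithmetic only. The duplication rate is Facebook's
    # self-reported 2021 figure; the true rate across all NCMEC reports
    # is unknown.
    total_reports  = 20_500_000  # NCMEC CyberTipline reports, 2024
    duplicate_rate = 0.90        # share of reports that are duplicates

    implied_unique = total_reports * (1 - duplicate_rate)
    print(f"Implied unique items: {implied_unique:,.0f}")  # ~2,050,000

    # Facebook also found that almost half of its reports traced back to
    # just six viral videos, so report volume is dominated by re-shares
    # of a handful of files rather than by newly created material.

Even on these assumptions, the headline figure would overstate the number of distinct items by an order of magnitude.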
We get another insight into the reliability of CSAM statistics from a 2019 report by Swiss authorities, who found that 90% of the images they receive from NCMEC are later determined to be lawful. Part of the problem is that there is a built-in incentive for platforms to be over-inclusive in what they report. Under U.S. law, electronic service providers are required to report “apparent” violations, and they face significant legal, regulatory, and reputational risk if they are seen to have under-reported. There is little comparable downside to over-reporting. As a result, platforms tend to err on the side of forwarding anything that could plausibly fall within the reporting obligation, including borderline age determinations and low-confidence automated matches.
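The same point can be made with a standard base-rate calculation. The numbers in this sketch are invented purely for illustration (they are not drawn from any platform's disclosures), but they show how an automated scanner with even a tiny false-positive rate, applied to billions of lawful images, produces a flag stream that is mostly lawful content, much like the Swiss figure above:

    # Hypothetical numbers for illustration only -- not any platform's
    # real data. This shows the base-rate effect of scanning at scale.
    images_scanned = 1_000_000_000  # uploads scanned in some period
    prevalence     = 0.00001        # fraction that is actually CSAM
    true_pos_rate  = 0.99           # scanner catches 99% of real CSAM
    false_pos_rate = 0.0001         # 0.01% of lawful images mis-flagged

    actual_csam = images_scanned * prevalence
    true_flags  = actual_csam * true_pos_rate
    false_flags = (images_scanned - actual_csam) * false_pos_rate

    share_real = true_flags / (true_flags + false_flags)
    print(f"Flags that are actually CSAM: {share_real:.0%}")  # ~9%

Under these assumptions, roughly nine out of ten flags would be lawful images, even though the scanner catches 99% of real CSAM.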
Until recently, it seemed that this incentive structure might be subject to change, thanks to a civil lawsuit brought by William Lawshe against Verizon and a cloud service provider for reporting him to authorities over supposed CSAM that turned out to be 18+ erotic images. When we blogged about that case in January, it appeared that platforms could become liable if they reported someone over "unconfirmed CSAM" that later turned out to be legal content. However, the case has since been settled, under terms that absolve Verizon from liability for over-reporting lawful content going forward.

One tidbit that was confirmed in the Lawshe case was that platforms such as Verizon aren't relying on the verified database of known CSAM that NCMEC itself compiles, but on less accurate databases cobbled together from unconfirmed reports made by platforms themselves or by child safety interest groups. One of these sources, the Canadian Centre for Child Protection, has included photographs of unclothed Indigenous people, and even a frame from the children’s movie Pippi Longstocking, in its reports.

Where does this leave us when assessing the actual volume of real CSAM that animates so much regulatory interest? It leaves us with scary-sounding numbers that actually mean very little. We don't know how many unique files exist. We don't know how many depict real children. We don't know how many are newly created. We don't know how many are lawful but misclassified. And we have no idea how many involve hands-on abuse that could have been prevented by a different regulatory approach.
None of this is to deny that child sexual abuse is real, devastating, and deserving of serious intervention. But policymaking driven by inflated, ambiguous, or poorly contextualized metrics risks distorting both public understanding and legislative response. When report volume is treated as equivalent to victimization volume, every increase in detection technology, every change in reporting format, and every viral incident can be misread as proof of exponential growth in abuse itself.

If we are going to regulate speech, mandate surveillance, weaken encryption, or impose sweeping liability regimes in the name of combating CSAM, we should at minimum insist on statistical clarity. That means distinguishing between reports and confirmed material, between duplicates and unique files, between automated flags and human review, and between lawful but controversial content and criminal abuse.

Without that clarity, we are not regulating in response to evidence. We are regulating in response to numbers that sound terrifying — but that we do not actually understand.
The Latest from Our Blog
Deepfakes, Fiction, and the Future of CSAM Law
On February 10, 2026, Australian author Lauren Ashley Matrosa was convicted of possessing and distributing child abuse material. However, no children were involved or harmed. The conviction was based on an erotic novel that she had written, in which the only sexual activity takes place between adults play-acting a "Daddy Dom/little girl" (DD/lg) fantasy. Nevertheless, the judge ruled that Australia's …
Living Under the ICE Age: How the Agency Is Surveilling People and What You Can Do to Prevent It
Since President Trump was reelected, Immigration and Customs Enforcement has gone on a rampage to detain and deport people in our communities. ICE has come under massive scrutiny for confronting and detaining undocumented people with no criminal history, legal immigrants, and even U.S. citizens in ways that are violent and dangerous. Especially after the murders of Renee Nicole Good and Alex Pretti, …
Beyond the Filter: Tech-Facilitated Gender-Based Violence
In this episode of Beyond the Filter, Brandy and Jeremy examine how gender-based harm is being amplified by digital platforms and AI tools, using the recent Grok scandal on X as a case study. The chatbot’s ability to generate sexualized, non-consensual images of real women and girls exposed serious failures in platform governance and safeguards. Joining the discussion is Sofia Bonilla, Strategy …
The Elephant in the Lobby
This is the first in a new series of commentary articles from an independent contributor, beginning with a look at Jeffrey Epstein and anti-porn advocacy.

Jeffrey Epstein used his considerable wealth, power, and influence to protect his clients from accountability and from facing the justice they rightfully deserved. The numerous victims who have come forward still wait for the day they can attain the closure and justice they seek. The files and their contents are being released at a reluctant snail's pace, and as they are, a profound sense of betrayal, hurt, and confusion is understandable, as is the sense of triumph among those who knew all along. But two things prompted the writing of this article: the links between Epstein himself and 4chan, and what he sent to his associates.

4chan needs no introduction, being a hive for the most abhorrent material and opinions that no one should be unfortunate enough to witness. I harbor no sympathy for the users of the site. That did not change when it was revealed that M00t had allegedly made contact with Epstein, who is said to have had a major hand in shaping /pol/, fascism, and the current hard-right extremist movement. There are also allegations that Something Awful's ban of hentai on its forum led to the creation of more extremist-friendly hubs like 4chan, 8chan, and /pol/.

For those who don't know, /pol/ is the more political section of 4chan, allegedly created in response to a request to M00t from Epstein himself. I am not a journalist, so I can't say for certain that this is the real reason. But judging from some of the negative reactions on 4chan after the recent Epstein revelations, it came as a surprise to many users that they were being used for a far-right coup or conspiracy. You can guess what /pol/ was incubating, and what the result was: /pol/ aided the rise of Trump and of far-right 'superstars' like Milo Yiannopoulos and Nick Fuentes, along with many other 'Neo-Nazi edgelords'. There's a long, long rabbit hole that goes into Gamergate, 8chan, and Kiwi Farms, but the short version is that /pol/ was co-opted by the right wing to advance itself and its agenda.

The hatred of pornography and fiction is not accidental or coincidental; these movements use coded language to signal in-groups and out-groups. The Nazis took a hardline anti-pornography stance during their takeover, saying that pornography promoted 'degeneracy'. Current members of /pol/ likewise use this language to call certain fandom communities and sex workers 'degenerates' or 'degens', with some believing that pornography makes men 'weak' or 'emasculated'. Again, the rabbit hole of the extreme right is deep. /pol/ also contributed, along with QAnon, to the Pizzagate conspiracy theory, which held that Hillary Clinton and others were hiding a child sex trafficking operation at a pizza shop. And that's just one example of how they operate.

However, along with these revelations came the boilerplate comments and all-around bashing of the fictional material he shared with his associates: hentai, fanfiction, fanart, and other forms of media that weren't CSAM. The fact that he distributed it with the intent to groom is irrelevant to anti-porn activists. It frustrates me, as a victim of sexual assault, to see the focus put on Adventure Time pictures, Five Nights at Freddy's stories, and animated pornography, rather than on the fact that this man hurt hundreds, if not thousands, of vulnerable children and women, and now, through his actions, our very country.

This is not to say that Epstein did not distribute CSAM or non-consensual material featuring his victims. It's that this discussion about the material itself is puerile and disgraceful, and more than that, the anti-pornography lobby delights in the blurring of these lines, because it distracts from the true crimes and lets them make bad-faith arguments that consuming fiction or watching pornography made by consenting adults is indicative of your character. Bill Cosby certainly didn't watch any pornography, yet he was revealed to have sexually abused numerous women. Religious figures proselytize about purity and morality; I venture they still have a higher sexual abuse rate than most fan conventions.

Distribution with intent to groom, sexual abuse, trafficking, aiding and abetting abusers, and the degradation of democracy through the encouragement of right-wing extremism should be the takeaways, not giggling over fictional material and demonizing pornography.
Excerpt from the COSL 2026 Annual Report
Safety and liberty online should not be at odds with each other. When speech is indiscriminately censored and access is curtailed, online spaces for discussion, creativity, support, and protest either disappear or are drained of their diversity and vitality. At the same time, online spaces marked by gender-based violence, hate speech, or targeted harassment are not spaces in which marginalized individuals are genuinely free to participate.

Too often, online policy debates treat safety and freedom as a zero-sum choice: more safety is assumed to require less liberty, and more liberty is assumed to tolerate harm. This framing is both false and damaging. It obscures the ways in which overbroad censorship can undermine safety, while allowing real abuse to persist unaddressed. It also marginalizes the very communities most often invoked in the name of protection.

The Center for Online Safety and Liberty (COSL) was launched in 2025 to challenge this false binary. COSL is founded on the principle that safety and liberty are not competing values, but mutually reinforcing ones—and that durable, inclusive online spaces require both. We confront both governments and corporations that perpetuate the false binary between safety and liberty, and we pursue change through research, permissionless innovation, and grassroots action.

During 2025, COSL launched ten ambitious projects that promote safety and liberty as parallel values, clustered within four Priority Areas: Safer Hosting, Supporting Fans, Cyberbullying and Abuse, and Legal Advocacy. While most of these initial projects are managed in-house, COSL also offers fiscal sponsorship to vetted independent projects that advance our mission and fit within one of our Priority Areas.

COSL ended 2025 with a small team of seven, with oversight from a compact startup Board of four. Despite these limited human resources, we published over twenty blog posts, over 400 social media posts, and eight podcast episodes during the year, placed two op-eds, and secured our first philanthropic grant—all in addition to our broader programmatic work. Read on to discover more about our achievements in 2025 as well as our vision for the organization’s future.
Join Us as an Activist
Once again, COSL has an opening for an Activist to join our team and help drive strategic, high-impact advocacy initiatives. In this role, you’ll work closely with COSL’s management team to identify timely opportunities, design persuasive campaigns, and bring them to life through social media, open letters, policy submissions, petitions, and direct engagement. You’ll meet virtually with policymakers, influencers, and coalition partners, build relationships across borders, and help ensure COSL’s voice is heard in fast-moving debates around online safety, censorship, and digital rights.

This is a part-time, fully remote volunteer role (approximately 5–10 hours per week), ideal for someone who is deeply motivated by digital-rights issues and eager to turn ideas into action. You’ll gain hands-on experience in advocacy strategy, lobbying, and campaign execution, while connecting with a global network of civil-liberties and technology-policy professionals. If you’re passionate about defending online freedom — and want to help shape the conversations that matter — we warmly invite you to apply.
Support our work!
Pledging monthly is the best way to support our work, because it gives us the stability to plan ahead. You can pledge your support at three levels.
Keep In Touch With Us