Newsletter #8, November 2025

Seeing Clearly: Why Precision Protects Children in the Age of AI

Around the world, regulators and technology platforms are under growing pressure to detect and remove child sexual abuse material (CSAM). The rise of generative AI has made that challenge even more complex, expanding not only what can be created, but what must now be distinguished: what’s real, what’s synthetic, and what’s simply artistic or fictional expression.

At the Center for Online Safety & Liberty, we believe protecting children and protecting rights are not competing goals. Both depend on one thing that is too often overlooked: precision.

When Definitions Blur, Children Pay the Price

In Europe, recent proposals to mandate device-level scanning have been described by experts as a threat to free expression and privacy. These measures would treat real child sexual abuse and fictional or AI-generated imagery as equivalent categories of harm. Yet research from COSL’s Drawing the Line Between Personal Expression and Lived Abuse initiative shows that when detection systems fail to distinguish between real and synthetic material, the people most in danger become harder to find.
The intent behind such policies is to strengthen protection for children. But when definitions collapse, the result is a system that grows busier — not more effective.

Volume Isn’t Victory

In 2024, the National Center for Missing & Exploited Children’s CyberTipline received more than 29 million separate incident reports. Over 8 percent contained too little information to even determine jurisdiction. Despite record reporting levels, the number of children identified and rescued has not kept pace.
When enforcement frameworks treat fictional, artistic, or synthetic material as equivalent to real abuse, the reporting volume expands, but the precision required to locate and protect real children erodes. In short, the system becomes overwhelmed by noise.

When Privacy Is Treated as the Enemy

The pressure to weaken privacy protections has intensified in recent years. Signal, for example, has said it would rather leave the UK than expose encrypted messages to government scanning. If major platforms exit jurisdictions or respond by mass-suppressing content to avoid liability, the result is not fewer cases of abuse — it’s fewer opportunities to detect and stop real exploitation.
The trend extends beyond Europe. The UK’s Online Safety Act has normalized pre-emptive content restriction; Australia’s age-based enforcement model has led to sweeping platform controls for minors; and in the United States, age-verification mandates are now facing constitutional challenges. Each approach expands enforcement faster than legal clarity, driving systems toward over-censorship of lawful expression.
As a result, many users simply go around these systems. VPN usage has surged in countries implementing aggressive filtering, and Tor browser activity is at its highest levels in years — making it even harder for investigators to trace genuine abuse.

The Human Cost of Misplaced Enforcement

Expanding surveillance of anonymity tools may seem like an easy fix, but it undermines the very protections survivors, journalists, and at-risk communities rely on. Weakening privacy doesn’t stop harm; rather, it drives it further underground.

Meanwhile, prevention and rehabilitation programs remain chronically underfunded. Studies show that nearly half of individuals seeking out illegal material online want help to stop, yet three-quarters cannot find it. At the same time, resources have been redirected toward prosecuting works of fiction or artwork where no child was harmed.

These misplaced priorities don’t protect children. They spread enforcement thin and divert attention from real victims who need rescue and recovery.

Precision Is Protection

Our research at COSL confirms that many authorities do not distinguish between real and synthetic material in their enforcement data at all. When law and technology blur those boundaries, it becomes harder to find the children who are actually being harmed.
Criminal law must target real harm to real people.
  • Sexual abuse imagery involving real children requires an urgent criminal response.
  • Non-consensual synthetic sexual imagery targeting identifiable individuals demands civil and, in many cases, criminal remedies.
  • When no real child is depicted, the appropriate response lies in education, content governance, and prevention, ensuring that law-enforcement resources remain trained on actual exploitation.

Seeing Clearly

Protecting children requires more than urgency; it requires focus. When systems are flooded with noise, they lose sight of the very cases that matter most.

In the age of AI, law enforcement, policymakers, and platforms alike need clear definitions and accountable processes to separate real harm from representation. That clarity makes every response, from prevention to prosecution, more effective.

At COSL, we’re bringing together survivors, technologists, and advocates to help build systems that can do both: protect children and preserve rights. Because when our institutions see clearly, they can act decisively — and protect those who need it most.

The Latest from Our Blog

Fan Refuge Launch Postponed and How You Can Help

One of COSL’s most anticipated projects, Fan Refuge, a fandom community site built by fans, for fans, has been postponed. This decision was not made lightly. Our team is pivoting quickly in response to the changing legal landscape of online speech. There has been a growing wave of concerning new laws across the U.S., which we have been tracking and fighting through our …

So you're living under fascism -- what do you do now?

For those of us not living under a rock, and with the unfortunate ability of pattern recognition, it’s not hard to realize that if you’re living in the United States right now we are living under a fascist regime. You have the President of the United States ordering the Texas National Guard to invade another state (Illinois), and children and their parents …

Beyond the Filter: Fantasy Sexual Materials and Offending

Hosts Brandy and Jeremy talk with Dr. Craig Harper of Nottingham Trent University about his research on whether fantasy sexual materials—like AI-generated images, cartoons, or sex dolls—are linked to real-world offending. Dr. Harper explains that, despite common policy assumptions, there’s no evidence such materials increase risk. Across multiple studies, his team has found no connection between fictional sexual content and …

Global Statement on the Role of Encryption in Securing Trust and Enabling the Digital Economy

On November 17, COSL joined with 60 other organizations to release the following statement on the role of encryption in securing trust and enabling the digital economy:

The undersigned believe that strong encryption is essential to the global digital economy. Encryption safeguards user privacy, protects sensitive data, and enables trust, which are foundations of commerce, communication, and innovation. Encryption is a vital tool for ensuring that consumers, businesses, and governments can confidently engage online, fostering a secure environment that supports economic growth and cross-border collaboration.

Any effort to compromise encryption, whether through backdoors, key escrow systems, or technical mandates, undermines that trust. Weakening encryption introduces systemic vulnerabilities that criminals and hostile actors can exploit, erodes consumer confidence, and drives users and businesses toward insecure platforms. Further, inconsistent national approaches to encryption risk fragmenting the global digital economy, creating barriers to trade and interoperability across borders.

We recognise the legitimate needs of law enforcement and national security agencies to access evidence and combat crime. However, these goals must be pursued through lawful, proportionate, and technologically sound means that do not compromise the safety and privacy of billions of consumer and enterprise users. Policymakers should strengthen, not weaken, the tools that protect our shared digital infrastructure.

By endorsing this statement, we collectively call on governments around the globe to advance policies that protect encryption as a vital enabler of digital trust and economic prosperity. All stakeholders must stand together to ensure that strong encryption remains available to establish and maintain trust across the global digital economy.

A Preview from the Drawing the Line Watchlist

Next month the Center for Online Safety and Liberty (COSL) will release its Drawing the Line Watchlist 2025, a major report covering 10 jurisdictions across six continents. The report reveals how misplaced moral panic is diverting resources away from real child abuse crimes and toward obscenity prosecutions that target creators, fans, LGBTQ+ communities, and even children themselves.

One of the most shocking revelations from the upcoming Watchlist is that in the United Kingdom, prosecutions over real child sexual abuse material have actually fallen, while prosecutions over victimless art and fiction have ballooned to a staggering 40% of all image prosecutions. Read more in the preview infographic below, prepared by COSL's own activist, Valerie.
[Infographic: Drawing the Line, United Kingdom]

    Technical Manager – Liberato Hosting

      At a time when censorship, surveillance, and platform crackdowns are becoming the norm, Liberato stands as a refuge for creators, activists, and communities who need a safe home for their digital expression. A project of the Center for Online Safety and Liberty (COSL), Liberato operates entirely on its own self-hosted infrastructure — independent of major cloud providers and built from the ground up to prioritize privacy, resilience, and user rights. With encrypted, offshore hosting; a commitment to hands-off logging practices; and a trust-and-safety approach rooted in civil liberties, Liberato is one of the few platforms truly dedicated to protecting expression without compromise.

      As we expand this mission, we’re looking for a Technical Manager (Volunteer) to join our small but capable team. This is a chance to help steward a sophisticated hosting environment powered by Proxmox, Ceph, CyberPanel, LiteSpeed, Docker, and WHMCS — all working together to deliver stable, censorship-resistant service to people who need it. You’ll collaborate closely with the project lead, oversee our virtualized infrastructure, implement upgrades and migrations, strengthen network-level security, and help ensure that our systems remain reliable under pressure. The work is meaningful, technically challenging, and directly aligned with digital-freedom values.

      This volunteer position is flexible (5–10 hours per week), fully remote, and open to applicants worldwide. We welcome someone with Linux administration experience, familiarity with virtualized or clustered environments, and a strong sense of responsibility toward user privacy and human rights. In return, you’ll gain hands-on experience managing a complex hosting stack, contribute to a vital online-freedom initiative, and — if you wish — earn commissions for new client referrals. If you care deeply about protecting expression online and want to help build the infrastructure that makes it possible, we’d love to hear from you.

      Support our work!

Pledging monthly support is the best way to back our work, because it gives us the stability to plan ahead. You can pledge at three levels.


      Keep In Touch With Us

LinkedIn