This webinar, held on 10 December 2025, marked the global launch of the Drawing the Line Watchlist—a groundbreaking report examining how ten countries around the world are increasingly blurring the line between personal expression (art, fiction, advocacy, consensual adult material) and lived abuse involving real victims. Joined by three members of the project’s Advisory Board—Emma Shapiro, Ashley Remminga, and Zora Rush—this live podcast-webinar explores how censorship, moral panic, and poorly drafted laws are reshaping the digital landscape for artists, queer communities, and marginalized creators. Together, the panel unpacks:
- Why fictional content is being conflated with real sexual abuse, and how law enforcement resources are being redirected away from crimes with actual victims
- How artists, especially those depicting bodies or erotic themes, face uneven and often punitive moderation
- The impact of moral panics on queer and trans fandom spaces, and the historical roots of these controversies
- How AI systems struggle with context, nuance, and cultural bias—and what it means for sexual expression and safety online
- Examples of global overreach, including prosecutions of artists, writers, and even teenagers for fictional material
- Five key reforms governments should adopt to restore clarity, protect children, and uphold human rights
Guests share on-the-ground insights from their domains—arts advocacy, queer cultural research, and responsible AI—while Jeremy previews the Watchlist’s findings, including startling shifts in enforcement patterns and international case studies. The conversation closes with actionable recommendations for policymakers, platforms, and civil society about how to genuinely keep children safe without eroding creative and queer expression. For the full report and the Drawing the Line Principles, visit: drawingthelineprinciples.org.
Additional questions
These questions were asked and answered in the chat, which is not visible in the video.
Q: Does Drawing the Line have any idea why prosecutions for real child sexual abuse images have fallen so dramatically in the UK and perhaps elsewhere? Is it simply that policing the internet has been successful, so that such images have become much harder to access? If so, is there any evidence that the dark web is also being effectively policed?
A: It does seem to be a substitution effect, where obscenity prosecutions have become a proxy for real CSAM prosecutions. We may speculate (though it’s only that) that since this content is legal elsewhere in the world, people may post directly on clearweb social media, where they are easier pickings for police than dark web offenders who take more care to conceal their activities.
Q: “…prosecution for a novel, under child abuse laws, even when the characters are over 18”. Really? So where do children come into it? Did I miss you saying that? What is the alleged problem? Are there any details of this case—the name of the novel, or of the defendant?
A: Here is a link to the details of the case – it is an ageplay-themed novel, and the adult characters allude to earlier times of life, which seems to be the basis for the prosecution.
Q: Regarding the question about UK real-CSAM prosecutions falling: if this were a case of low-hanging fruit (lots of fictional material on the clearweb), why would this lead to a DEcrease in prosecutions for real CSAM? Why wouldn’t overall prosecutions increase?
A: We need to dig deeper to answer this definitively, but our hypothesis is that resources are limited—we hear the same from other jurisdictions. There is always a backlog of images to be processed, so virtual content displaces real content in the queue, and because official statistics do not differentiate between the two, the shift goes unnoticed. Until now.
