Harmful to Minors

Happy May! I’m Caden, a technologist with COSL, and on behalf of the organization I’d like to start the month by announcing the launch of a new project. Harmful to Minors is our new archive, designed to bring greater transparency to cases where Internet content is taken down, or users are banned, on the grounds of sexuality, gender, or child safety. These reasons are frequently grouped together, along with others, under the umbrella phrase “harmful to minors”.

Name: Harmful to Minors

Dates: April 2025 – present

Description: A transparency register for child safety content takedowns

Contact: Caden

Contact Details: [email protected]

Website: https://harmfultominors.org

Type: Sponsored Project

Priority Area: Legal Advocacy

Censorship masquerading as child safety

We’ve previously touched on the phrase “harmful to minors”, an elastic concept that sounds as though it refers to hardcore pornography. But I’d like to elaborate on what makes the term so loaded. The “harmful to minors” standard is, in fact, a modified version of the Miller test for obscenity: each prong is evaluated with respect to minors, which relaxes the test and broadens the range of speech that can be regulated. A work is “harmful to minors” if:

  • The work, taken as a whole, appeals to the prurient interest of minors
  • The work depicts or describes, in a patently offensive way, sexual conduct or excretory functions defined by applicable state law for minors
  • The work, taken as a whole, lacks serious literary, artistic, political, or scientific value for minors

Rather than protecting children from pornography and illegal content, the phrase “harmful to minors” lets lawmakers regulate essentially any speech they take issue with but cannot define as obscenity for adults, using child safety as a pretext.

In states such as Florida, this justification is being used to remove important literary works such as “To Kill a Mockingbird” from library shelves. And in recent years, lawmakers have attempted to apply the same regulations to content on the internet. In 2023, at least eight states attempted to pass legislation regulating content “harmful to minors” online; in 2024, that number rose to at least nineteen. In this environment, companies have chosen to comply by implementing age verification for their services (a separate topic you can find out more about here), by withdrawing their services from states with such legislation, or by enforcing more stringent moderation to avoid having their services classified as “harmful to minors”.

The internet and the marginalized

This moderation, however, often inflicts substantial harm on marginalized communities whose behavior is unfairly classified as “harmful to minors” and who have relied on online platforms to find community. This is often seen with LGBTQ content, one of the first targets whenever platforms embark on a crackdown against “pornographic” content.

For example, in the wake of Tumblr’s removal from the iOS App Store in 2018, the company began enforcing an adult content ban that, despite numerous modifications over the years, remains in place today. What the company considered adult content included documentation of transmasculine surgeries and art blogs depicting LGBTQ sexuality, all of which was swept up in Tumblr’s purge of anything and everything adult. In the aftermath of the ban, users left the website in droves, many of whom identified as LGBTQ.

Even on platforms tolerant of content aimed at adults, content is often unfairly moderated as obscene and harmful. This is the case with “dead dove” media: written fiction and art that contain controversial themes. Such content is often removed because platforms and moderators believe it can influence illegal behavior in the real world. This could not be further from the truth; the research points to the opposite: fictional depictions have no demonstrated causal link to committing such acts in real life. In fact, removing such content often harms another group: sexual assault survivors. It is well established among therapists that survivors can use fiction as an outlet for processing trauma. By removing this outlet, which, again, does not lead to real-world violence, we do survivors and others a great disservice.

Transparency

The current tools available online, however, do not offer people a way to fight these unjust bans and takedowns. When a platform does offer an opportunity to appeal, individuals are often met with silence or a refusal to act. Moderation remains an opaque process, with no way to hold those involved accountable or to demand changes to moderation policy. For marginalized communities, this system is reckless at best and malicious at worst, causing harm to the most vulnerable.

The damage that “harmful to minors” policies are inflicting on these communities is what prompted me to build the Harmful to Minors transparency archive here at COSL. The archive logs bans and takedowns that occur on social media platforms on the grounds of sexuality, gender, or child safety. By making information about these actions publicly available, we hope to make it easier for the public to scrutinize the moderation process, and to push for changes that make online content moderation and regulation better for everyone, not just those with authority.
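
To make this concrete, here is a minimal sketch of what a single archive entry might record. The field names below are illustrative assumptions on my part, not the actual Harmful to Minors schema.

```python
# Hypothetical sketch of a single Harmful to Minors archive entry.
# Field names are illustrative assumptions, not the project's actual schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ModerationAction:
    platform: str                       # e.g. "Tumblr"
    action: str                         # "takedown" or "ban"
    stated_grounds: str                 # e.g. "harmful to minors", "adult content"
    topic: str                          # archive category, e.g. "nudity", "obscenity"
    date_of_action: date                # when the platform acted
    description: str                    # what was removed, or who was banned
    appeal_outcome: str | None = None   # the platform's response to an appeal, if any
    sources: list[str] = field(default_factory=list)  # public links documenting the action

# Example entry (details invented for illustration):
entry = ModerationAction(
    platform="ExamplePlatform",
    action="takedown",
    stated_grounds="harmful to minors",
    topic="obscenity",
    date_of_action=date(2025, 4, 1),
    description="Support-group page removed following a complaint.",
)
```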

Responses to censorship

Just as important as logging censorship requests is logging how they’re responded to, and Harmful to Minors does that too. Sometimes you’ll find that the content wasn’t taken down because the host or platform pushed back. Other times, the content was taken down, and that was probably the right choice; there’s nothing wrong, for example, with removing pornographic content from a website that isn’t meant for porn.

So it’s important to note that including a takedown request or ban in the Harmful to Minors archive doesn’t necessarily amount to a judgment about how it was handled. Rather, we’re trying to shed light on how content about sex and gender is being censored, along with the pushback. We also hope to document any patterns that emerge as the Internet shifts towards being less welcoming to sexual content in general, and LGBTQ+ expression in particular.

Harmful to Minors also aims to educate. We categorize the notices we receive by topic (such as nudity or obscenity), and include answers to the frequently asked questions they raise (for example: what is the difference between obscenity and CSAM?). This educational material is still being built out, and we welcome contributions.

Submitting to Harmful to Minors

If you recognize what we’re doing, that’s because it’s not an entirely original idea. We’re big fans of Harvard University’s Lumen Database (no relation to Lumon Industries), a longstanding transparency archive mostly devoted to copyright-based takedowns. Lumen doesn’t cover takedowns or bans over sexual or gender-themed content, so our Harmful to Minors project is designed to fill that gap.

Our transparency archive can be found here, and we already have a few cases available to browse: among them, the censorship of a support group, an ethnographic blog, a visual novel, and furry fandom artwork. Normally these sorts of decisions fly under the radar, but not anymore. Harmful to Minors exists to hold those responsible accountable for their decisions.

However, we need more information about the moderation actions taking place each day! For now, the best way to contribute examples of takedowns or bans is to use the Report a Moderation Action button on our website. The Lumen database also receives an automatic feed of takedowns from large platforms such as Google as part of its logging of copyright demands. In the future, we plan to support this mode of submission as well, and we would welcome expressions of interest from any platforms that wish to integrate Harmful to Minors logging into their moderation workflow.
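
For platforms curious about what an automated feed might eventually look like, here is a purely hypothetical sketch of a submission. The endpoint, payload fields, and authentication scheme are all assumptions for illustration; no such API exists yet, and real integration details will follow once we support this mode of submission.

```python
# Purely hypothetical sketch of an automated submission to a future
# Harmful to Minors JSON API; the endpoint and payload are assumptions.
import requests

payload = {
    "platform": "ExamplePlatform",
    "action": "takedown",
    "stated_grounds": "harmful to minors",
    "topic": "nudity",
    "date_of_action": "2025-04-01",
    "description": "Art blog removed under the platform's adult-content policy.",
    "sources": ["https://example.com/archived-notice"],
}

resp = requests.post(
    "https://harmfultominors.org/api/v1/actions",  # assumed URL, not a real endpoint
    json=payload,
    headers={"Authorization": "Bearer <platform-api-token>"},  # placeholder credential
    timeout=10,
)
resp.raise_for_status()
print("Logged moderation action:", resp.json().get("id"))
```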

Conclusion

The Harmful to Minors project is a vital step toward transparency in an era where “child safety” is increasingly used to justify censorship of lawful content, from support groups to artistic expression. By documenting takedowns, responses, and emerging patterns, we aim to hold decision-makers accountable and foster informed discussion about online freedom, particularly for sexual and LGBTQ+ content. Join us in this mission: submit examples, explore our growing archive, and help shape a more open and equitable internet. Visit harmfultominors.org to get involved today!



Support the project through a donation

The Harmful to Minors project is proudly supported by our Legal Advocacy fundraiser. Every donation helps run and maintain projects that fall under this priority area. Thank you for your support!
