More than 200 child advocacy organizations and researchers just sent an open letter to YouTube CEO Neal Mohan and Google CEO Sundar Pichai with a blunt demand: ban AI-generated slop from YouTube Kids entirely. The letter, organized by Fairplay and signed by the American Federation of Teachers, the American Counseling Association, and Jonathan Haidt among others, calls AI slop on children's platforms a growing crisis that nobody is taking seriously enough.

The Numbers Are Staggering

A New York Times investigation from March 2026 found that roughly 40% of videos recommended to children on both the main YouTube platform and YouTube Kids appear to be AI-generated slop. These are not carefully crafted educational videos. They are plotless, repetitive, mass-produced clips designed to hold a child's attention long enough to serve ads.

And they are extremely profitable. According to Fairplay's research, the top AI slop channels targeting children earn over $4.25 million in annual revenue. Some creators openly advertise their earnings from what they describe as "plotless, mesmerizing AI content." The economics are simple: near-zero production costs plus a captive audience of children who cannot tell the difference equals easy money.

What the Letter Demands

The coalition is asking YouTube for specific commitments, chief among them the one the letter leads with: a complete ban on AI-generated content in YouTube Kids. The signatories also want AI-generated content clearly labeled wherever children may encounter it on the platform.

YouTube's Response

YouTube spokesperson Boot Bullwinkle stated that YouTube has "high standards for the content in YouTube Kids, including limiting AI-generated content in the app to a small set of high-quality channels." YouTube also confirmed it is developing dedicated AI labels for YouTube Kids but offered no timeline. That is corporate-speak for "we know it is a problem and we are not going to fix it quickly."

Why This Matters for AI Creators

If you make AI content — even good AI content with real stories and real effort behind it — this affects you. Every piece of AI slop that farms kids for ad revenue makes it harder for platforms to trust AI creators generally. The inevitable regulation will not distinguish between a mass-produced slop factory and a solo creator making a show like Fruit Love Island with genuine storytelling.

The AI content industry needs to self-regulate before regulators do it for us. That means labeling your content honestly, not targeting children with low-effort garbage, and building things that actually deserve an audience. The alternative is a blanket crackdown that hurts everyone.