Why YouTube Kids is broken
Tuesday, 7 October 2025 - 1,355 words, 7 min read
As a parent of two daughters, I'm really annoyed by what YouTube Kids is doing—and, more specifically, by what they're not doing. Their promise is to give me agency, but in practice it feels broken. So I'm writing down my personal struggles and my future concerns.
My daughters love to watch videos, and I can’t blame them.
As a nerd, I’m not overly concerned about the amount of screen time; what worries me is what they do within that screen time and at what cost. So let me show you what they could be watching on YouTube Kids.
Source: https://nosto.re/2d9d7a93685ddea163cc47baca4d432a6564b6a1eac1d3a608d22bb2d2e12743.mp4
What this video shows:
- A lot of boring or repetitive content when you select age-appropriate content for ages 5–8 on YouTube Kids.
- A lot of content seems optimized to generate anxiety (police, monsters, thieves, zombies).
- A lot of content isn’t in my child’s native language.
Other annoyances I’ve noticed in the app:
- My child can switch to other child profiles with different settings.
- YouTube Kids now serves portrait videos as well as landscape ones, which means my kids rotate the device constantly as they switch videos.
- I can’t add my partner to our Family Link setup without creating Google accounts for each of my children.
Curated kids-friendly content by YouTube is broken
This one-hour documentary is worth watching. It will change your mind about how much YouTube cares about our well-being (spoiler: it's always profit over well-being).
💡 Below is an LLM-generated summary (GPT-5 nano) of this video, created with https://ppq.ai.
Transcript: https://youtubetranscript.org/video?videoId=wnK8VVUMuVs
TL;DR
The video argues that YouTube and YouTube Kids are unsafe for children due to pervasive exploitative content from anonymous “content farms,” NSFW ads, and the use of AI-generated videos. It alleges YouTube tolerates this for profit, shows examples of how content is orchestrated to target kids, and urges parents to monitor kids’ viewing and seek safer alternatives.
Key takeaways
- Among the 80% of parents whose children regularly use YouTube, 46% said their child has encountered inappropriate videos on the platform.
- Content farms target children by repurposing kid-friendly characters into disturbing, sexualized, or violent content to maximize clicks and revenue.
- YouTube Kids app and the regular YouTube homepage can surface inappropriate material due to algorithmic recommendations, even when videos are not explicitly marked for kids.
- Some channels openly promote NSFW content via private Patreon pages and links, raising concerns about monetization of child-targeted content.
- CoComelon and other popular kids channels are critiqued for aggressive pacing, bright visuals, and other tactics designed to maximize child engagement.
- YouTube ads have become problematic, with sexually suggestive or graphic ads appearing, including AI-generated imagery and “companion” apps that prey on vulnerable users.
- The use of AI to create or alter videos (e.g., Runway’s video-to-video edits, minion filters, AI-generated “cute cat” channels with gruesome content) complicates moderation, as many AI-created works don’t fit YouTube’s existing detection mechanisms.
- There are real-world concerns and anecdotes about long-term effects on children, including increased anxiety, attention issues, and exposure to harmful material.
- The video argues that YouTube’s revenue model (advertising revenue split with creators) incentivizes inaction, and that public pressure or regulatory actions are needed to provoke change.
- Solutions proposed include vigilant parental monitoring, safer alternatives (like PBS Kids), and broader awareness to spur advertiser and platform action.
Notable quotes / moments
- “We do this only for the money.” — on the motivation behind some NSFW content channels creating videos targeted at children.
- “YouTube doesn’t want it to change. Money.” — the core claim about the platform’s incentives.
- “YouTube Kids isn’t safe either way.” — summary of the video’s stance on safety.
- “Spread awareness to your community is invaluable.” — call to action for viewers by the creator of this video.
Quick breakdown by topic
- YouTube Kids and algorithmic risk: Even though some content is labeled as made for kids, the platform’s search/results can still surface unsafe material due to automated recommendations and imperfect human review.
- Specific channels and practices: Examples include Spunky-based content farms (with shock thumbnails), channels advertising NSFW content via Patreon, and live-action/2D hybrids that push adult themes to children.
- AI and content moderation: Runway’s AI tools and AI-generated content (e.g., “cute cat” channels with gruesome imagery) pose new moderation challenges; many tools lack clear safeguards against harmful outputs.
- Advertising and risk: NSFW or suggestive ads appear in YouTube’s ecosystem, including AI-generated imagery and AI chatbot ads, sometimes reaching younger accounts.
- Real-world impact: Anecdotes and warnings about how early exposure to explicit or disturbing content can affect mental health, attention, and behavior in children.
- Call to action and remedies: Prioritize parental monitoring, seek safer alternatives, and push for advertiser pressure or media scrutiny to force platform changes.
A resource list worth checking, shared by the video’s creator: https://docs.google.com/document/d/1gmVdEKYCbuneNGkbOtRig9-RlVuNzc_r4sIo05bjaZE/edit?tab=t.0
Here are some of my personal notes and learnings after I watched the documentary:
- I learned what Elsagate is (I didn’t know about this earlier moderation scandal on YouTube).
- YouTube is aware of this problem, but they don’t care.
- To fix this problem, you have to take action yourself. Don’t trust YouTube to fix it for you.
- I copied some interesting video comments here.
All the content is served by an algorithm, and that algorithm doesn’t seem to care about my daughters’ well-being. Why do I say that? Just look at the content itself; I don’t think I have to argue that this type of content is making them better. It’s content optimized to grab as much of their attention as possible and to make them addicted. And of course there are content creators with the same intentions, who will keep gaming the algorithm while pushing against YouTube’s guidelines. I can’t blame them; the system itself is the root cause.
How to fix this by myself & others
As a creative web developer and someone who understands how digital platforms work today, I think I know how this can be fixed. To fix it, you first need to understand how the current system operates. There are many approaches to making a system less relevant or useful; one is to build an alternative system that works better, both for the people using it and for our children’s mental health.
Let’s get back to my role as a parent. Together with my partner, I’m probably the best-placed person in the world to decide what’s best for my daughters. I already know what type of content I’d like to share with them and let them watch. As they grow older, they’ll know what they like to watch and can ask me to look things up for them. If their interests change, they can ask me to remove content they no longer like.
Another nice thing is discovering new content together. In this discovery mode, I find it very important that I have agency and can navigate away from content when necessary. It’s not only about discovering new content; it’s also about discovering my daughters’ new interests.
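The curation model described above boils down to an allowlist that only a parent can edit: the child sees exactly what was approved, nothing more. Here is a minimal sketch of that idea; all names here are hypothetical illustrations, not the actual kubo.watch implementation.

```python
# A minimal sketch of a parent-curated video library: the child only ever
# sees what a parent has explicitly approved. There is no recommendation
# algorithm; unapproved content is simply absent.
from dataclasses import dataclass, field

@dataclass
class Video:
    video_id: str
    title: str
    language: str  # lets a parent filter out non-native-language content

@dataclass
class CuratedLibrary:
    # video_id -> Video; only parents may modify this mapping
    approved: dict = field(default_factory=dict)

    def approve(self, video: Video) -> None:
        """Parent adds a video to the child's library."""
        self.approved[video.video_id] = video

    def remove(self, video_id: str) -> None:
        """Parent removes content the child no longer likes."""
        self.approved.pop(video_id, None)

    def child_can_watch(self, video_id: str) -> bool:
        """The child sees only explicitly approved content."""
        return video_id in self.approved

library = CuratedLibrary()
library.approve(Video("abc123", "How volcanoes work", "nl"))
print(library.child_can_watch("abc123"))  # True
print(library.child_can_watch("zzz999"))  # False: never approved, never shown
library.remove("abc123")
print(library.child_can_watch("abc123"))  # False: interests change, content goes
```

The key design point is that discovery and playback are separate: a parent browses and approves in one mode, while the child’s mode can only read from the allowlist.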
I think having agency as a parent is key here. That’s why there is Operation Kidstr.
More articles will be published while we’re in the ideation and discovery phase.
Within this initiative, I’ve taken on the responsibility to build a demo: https://kubo.watch. The goal of the demo is to prove that you can exercise full parental agency over what your children can do, based on trust. The demo focuses on video content, just like YouTube Kids. Here are some visual mockups of the demo.
Further progress on Operation Kidstr can be followed here:
- https://jumble.social/users/npub1kdstrkmhv0yx8pdqcf9ed8l26752gqprx68twg7qp5nsd7qtegnsr3nsze
- https://kubo.watch
If you’re interested in joining this initiative, please reach out. We’re open to ideas and to any kind of support.
Other resources I used while writing this article