This article was published on March 21, 2019

YouTube’s ‘child safe’ app is crawling with videos advocating suicide, murder, and sexual exploitation



For nearly five minutes, the cartoon is predictable. Colorful characters with child-like voices act out the types of scenarios capable of reversing even the wildest tantrums, turning screaming toddlers into tiny, submissive humans. It’s the optimal moment for a man to enter the frame and tell your small child the proper way to slit their wrists.

YouTube Kids is supposed to be the kid-friendly choice for discerning parents. It’s devoid of the conspiracy theories and misinformation that run rampant on the main site, and features cute and cuddly kids programming tailored to the youngest of children. It’s supposed to be inoffensive, a solution for parents needing 10 minutes to shower, or a distraction that allows them to prepare dinner. Parents, though, are increasingly learning that this isn’t always the case.

In this video, spotted by Free Hess, a pediatrician, child safety advocate, and mother, a man known as “Filthy Frank” enters the frame of a kid-friendly clip from the game Splatoon. His appearance is brief, first popping in at around 4:44 and then exiting just 10 seconds later. Before he goes, he leaves children with chilling instructions on how best to slit their own wrists: “Remember kids, sideways for attention, longways for results. End it.”

YouTube, for its part, is scrambling to find a way to proactively combat this type of offensive content. Currently, the system relies on concerned viewers to flag offensive videos. Flagged content is then reviewed by a human moderator, who determines whether it violates the rules and whether it can remain on the platform.


It’s an uphill battle, to be sure. But flagging doesn’t appear to be the solution. All of the videos shown below were found in less than 24 hours. They’ve been shared among Hess’ followers, flagged, and allowed to remain on YouTube Kids. The suicide instruction video was first flagged eight months ago; it was live until last month. Numerous others are currently live on the platform, some of which have been there for months.

There’s a Minecraft video that features adult language in the beginning before quickly pivoting to footage of a low-resolution school shooting. It’s not the only one.

Next there’s what appears to be a depiction of human trafficking, with Billie Eilish’s haunting vocals in “Lovely” playing in the background. The male character in the anime-style cartoon purchases a female character in a wolf costume for $400 from what appears to be another male. The female character is drawn in a way that suggests she has been beaten, with stitches covering the bulk of her face, and later in the animation she confesses that she has been continually mistreated. “Why are you scared?” the presumed john, named Alec, asks. “Because everyone mistreats me,” the victim responds.

Another video in the same style features threats of violence, adult language, and murder. It’s set to the song “Sweet but Psycho” by Ava Max.

The music in both videos isn’t coincidental, and it’s worth mentioning. Both artists, Ava Max and Billie Eilish, are popular with teens and younger children, and their tracks feature prominently in another teen-favorite app, TikTok. Max’s “Sweet but Psycho,” for example, has been used in more than two million videos, according to TikTok’s internal search feature.

In another example, in the same anime style, a young girl commits suicide with a knife after her boyfriend breaks up with her and her father dies.

These are just a small sampling, and such videos aren’t difficult to find for those searching for them. Unfortunately, many are finding them by accident, and those are typically young children who may not understand the thematic elements they’re being exposed to. In just a few minutes, I ran across several of these videos, some featuring sexual references, real or threatened violence, adult language, and drug advocacy.

According to YouTube’s website, this is a platform meant to be “a world of learning and fun, made just for kids.”

We created YouTube Kids to make it safer and simpler for kids to explore the world through online video – from their favorite shows and music to learning how to build a model volcano (or make slime ;-), and everything in between. There’s also a whole suite of parental controls, so you can tailor the experience to your family’s needs.

Clearly we have a different idea of what constitutes fun and learning, especially as both pertain to children.

Update (May 22, 2019): Commenting on what YouTube is doing to stop offensive videos from entering the YouTube Kids app, a company spokesperson told TNW:

We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.
