YouTube sucks at vetting racist videos for kids—so parents need to step in

A recent controversy was a great reminder that the video platform has a long way to go in protecting youngsters from inappropriate content.

Screenshot of Pinkfong's "Dig It Up" video

If you have young kids, chances are they're obsessed with YouTube. With millions of videos to choose from, the platform gives them a whole world of cute cartoons and upbeat songs right at their fingertips. So it's tempting to enjoy some alone time while they surf a seemingly endless roster of videos, many of which seem harmless enough—at least on the surface. But to assume that uploads aimed at youngsters are properly assessed for inappropriate content would be a mistake.

The company's notoriously shoddy vetting process and unreliable recommendation algorithm have put seemingly innocent but deeply disturbing content in front of children all over the world. The unintended result is that cartoons containing violence and racism are only a few mistaken clicks away.

Unfortunately, many parents re-learned this lesson last month courtesy of the company's latest gatekeeping failure, a cartoon called "Dina and the Prince." It served as a reminder that YouTube can't be trusted to keep your kids safe, and that parents need to step in and carefully screen content for their children.

A racist curse

YouTube came under fire once again on July 23, when the children's channel My Pingu TV caused an internet uproar with its fairy-tale video "Dina and the Prince." In the story, a white cartoon angel "loses her beauty" and turns black: when the magic takes effect, a now-"ugly" Dina has been transformed from a white, straight-haired angel into a darker-skinned version with curls. The video was uploaded on July 17 and viewed more than 400,000 times before being taken down.

Photo: Courtesy of My Pingu TV
Left: Dina before "losing her beauty"
Right: Dina after the curse

Many users reacted with shock in the comments, calling for the video to be removed. It also drew an explosive reaction on Twitter, where people were keen to express their outrage.
The message the story sends is undeniably racist and tells black kids they are only beautiful and worthy of love if they are white. Following a lengthy history of marginalization, it’s a huge deal for young people of colour to see themselves represented at all, but stumbling upon this video leaves them with the devastating message that their hair and skin colour make them ugly. In addition to lowering their self-esteem, this reinforces white supremacy and upholds structural racism.

But “Dina and the Prince” is hardly the first racist children's cartoon that the video giant has allowed young viewers to access—and this history of harmful content, much of it still live on the site, is highly problematic.

It’s happened before

Pretty much every parent is familiar with the channel Pinkfong, the brand behind "Baby Shark"—the uber-popular hit that's been played 221 million times on YouTube (and tortured many a present-day parent). But the South Korean company has tons of other catchy children's song videos, including the deeply problematic "Dig It Up" track and dinosaur compilation.


In the cartoon, the token black character (of which there is, of course, only one for the sake of *diversity*) looks like he's straight out of a minstrel show. His mouth is yellow and oversized, a prominent feature of minstrelsy, which began as a form of "entertainment" during slavery and continued well into the Jim Crow era.

Minstrel shows involved white people putting on blackface to portray black people as idiotic, lazy and goofy characters (like singing and dancing slaves), playing on a host of racist stereotypes. In short, it was meant to mock and demean black people by reducing them to outlandish caricatures.

Cartoon kids on a log
Photo: Pinkfong's "Dig It Up" dinosaur song and compilation video

Commenters pointed out how bad the character appears, with user L. Mellor writing: “GET RID OF THE YELLOW LIPS ON the Brown child. Looks too much like black face.” Santos D wrote: “Black kid needs a redesign. Looks like a caricature form the 1890s.”

Things go from bad to worse when you also notice that the kids in the cartoon are all wearing pith helmets, a relic of colonialism. The headpieces were worn during the 19th century by European colonizers in Asia, Africa and the Middle East before being popularized by military officers, quickly becoming a symbol of oppression.


While kids might not realize how problematic this is, it's a parent's job to make sure they aren't actively supporting such a racist depiction. Letting children carelessly view and sing along ignores the painful historical significance and helps perpetuate racism.

"Dig It Up" was uploaded in 2015 and has been viewed by more than 21 million people to date, but despite its concerning implications and commenter outcry, it has yet to be removed.

No stranger to controversy

YouTube's consistent inability to properly vet children's videos has resulted in kids seeing some seriously traumatic content. Starting in 2014 and peaking in 2017, kids and parents suffered through the phenomenon of “Elsagate,” when thousands of violent and inappropriate videos were deemed safe by YouTube and the YouTube Kids app.

Cut into seemingly innocuous videos starring beloved characters were scenes of drug use, sex, alcohol and violence, ranging from Paw Patrol characters attempting suicide to Peppa Pig drinking bleach and engaging in self-harm. They were uploaded using popular child-friendly search terms like “education” and “nursery rhymes” to evade detection, and thus were not filtered out by the app.

Take this unofficial Peppa Pig creation, for example. At first glance, nothing seems fishy, and any unsuspecting parent would assume it’s a simple compilation of cute episodes for their child. But as the cartoon progresses, things get very dark as beloved characters are mutilated and tortured, leaving many frightened and unsuspecting kids traumatized and in tears.


YouTube’s official policy states that it is not a platform meant for kids under the age of 13. Yet programs and videos for kids younger than that are among the most popular on the site. According to a new study by the Pew Research Center, videos intended for children under 13 drew larger audiences and were posted by more popular channels than content aimed at teen viewers or adults. Videos featuring children who appeared to be younger than 13 averaged almost three times more views than other videos.

So if it's mostly young kids using the site, why isn't YouTube doing more?

The company is slowly taking steps to improve its vetting process. Last year, it gave parents using YouTube Kids the option to limit viewing to videos curated and approved by humans instead of trusting an algorithm. With this setting enabled, kids should only be able to find and watch content that has been cleared by YouTube staff.

But this clearly didn't stop videos like “Dina and the Prince” and other racist cartoons from finding a home on the platform and reaching millions. Because of this, the onus can't solely be on YouTube for failing to remove these videos—the larger part of the problem is that they're getting made in the first place. The responsibility resides with the companies creating them to stop polluting the platform—but with no promise of this happening, it's ultimately up to parents to keep a closer eye on what their kids are watching.

What can parents do?

If you want to protect your kids from the trauma of seeing these cartoons, there are some things you can do to ensure that your child is only watching appropriate videos during screen time.

  • If you find a video that contains problematic content, you can use the flagging feature to submit it to YouTube's staff for review.
  • By going to the official channel of your kids' favourite characters, parents can be confident that the cartoons are legitimate. Just make sure they're verified with a checkmark beside the username.
  • Use the restricted mode feature, available on YouTube's website and app. At the top of the page, in the upper right-hand corner, there are three dots; hover over them and select the option to turn restricted mode on. This activates an algorithm that flags inappropriate content by analyzing video data such as the title, language, keywords and description. While YouTube admits the filter isn't 100 percent accurate, it's still one additional barrier between inappropriate videos and your child.
  • Watch videos before your kid does. Not always feasible, but with a platform filled with user-generated content, it's the only way to be sure of what your kid is viewing.
This article was originally published on Aug 14, 2019
