YouTube Creates Vetting Policy to Safeguard Children's Videos

A screen shot from one of the disturbing videos on YouTube. (Image: YouTube)
Theresa Hayes | 10 November, 2017, 21:23

YouTube announced in August 2017 that videos that "made inappropriate use of family friendly characters" would no longer be eligible for monetization.

"Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetisation", Juniper Downs, YouTube's director of policy, said in a statement.

Today's step is the next move in policing content on Google's YouTube Kids app.

Since YouTube Kids launched in February 2015, the algorithmically driven app has been criticized for lacking controls to restrict kid-unfriendly videos, as well as for allowing commercially oriented content targeted at kids. Speaking to The Verge, YouTube said the change was in the pipeline before this controversy arose, and that it isn't a direct result of it.

The YouTube app and its Kids variant have long had a problem with strange and outright inappropriate videos surfacing under keywords that target kids and feature family-friendly characters, but YouTube is reportedly planning to start filtering those videos out of the YouTube Kids app by age-restricting them when they're found. Some of them appear to be created by bots, which stuff keywords into videos in an attempt to game YouTube, while others seem to be deliberately made by people looking to disturb the children who watch them. Per Variety: "Our systems work hard to filter out more mature content from the app". The company has promised to age-restrict content that gets flagged by users.

Age-restricted videos can't be seen on either the site or the app by users who aren't logged in, or by those who have entered their age as under 18.

This new age-restriction policy should prevent that from happening by stopping inappropriate content from ever making it to YouTube Kids.

YouTube is trying to walk a fine line between owning up to this problem and arguing that the issue is relatively minor. Think Spider-Man squeezing large water balloons until they explode in slow motion while he sits in an empty bathtub (seen by an Australian dad and his 3-year-old), or Peppa Pig parodies that have her getting tortured at the dentist or drinking bleach. (We won't even link to them because, ugh, why give them more views.)

YouTube Kids has racked up over 30 billion views since its 2015 launch. Before any video appears in the YouTube Kids app, it's filtered by algorithms that are supposed to identify appropriate children's content and flag content that could be inappropriate or in violation of any YouTube policy. And the company is willing to forgo additional ad revenue - and there is a lot of money flowing through this segment of the industry - if that's what it takes to ensure YouTube Kids feels like a safe experience for families.