YouTube Kids implements new policy to flag inappropriate videos targeted at children

A New York Times piece and a subsequent Medium post this week highlighted an ongoing problem with YouTube Kids — bizarre and disturbing videos aimed at young children using keywords and popular children’s characters.

Now YouTube says it is putting in place a new process to age-restrict these types of videos in the main YouTube app.

“Age-restricted content is automatically not allowed in YouTube Kids,” YouTube told The Verge, which was one of the first to report the story.

But the new policy allows users to flag this type of inappropriate content in the main app, which has implications for the Kids app as well. However, YouTube said the change was not a direct response to the recent coverage, but that it had been formulating this new policy for a while.

YouTube Kids launched in 2015 to bring children suitable content, but sometimes gruesome videos portraying sex, drugs and violence have been sneaking their way in for some time.

YouTube originally addressed the issue by relying on its algorithm to weed out much of the inappropriate content, but that clearly hasn’t been working.

One recent example highlighted in the Medium post showed the cartoon character Peppa Pig drinking bleach. Another video showed Peppa getting her teeth violently yanked out at the dentist.

Obviously, these were not sanctioned videos made by the producers of Peppa Pig. What’s happening is that videos that appear at first to be innocent children’s programming can turn out to be disturbing content unsuitable for kids. And despite filters put in place to catch such videos, they still showed up on the children’s platform.

In August of this year, YouTube announced it would no longer allow video creators to make money off of the “inappropriate use of family characters.” The new policy takes those measures one step further by preventing inappropriate videos from making their way into the Kids app.

YouTube has confirmed to TechCrunch that content uploaded to the main app does not automatically go into the Kids app. Instead, it takes several days to populate. The new policy should add an extra layer of protection beyond the filters already in place.

YouTube will also now provide dedicated human teams to review flagged videos 24/7 to ensure they don’t make it onto the Kids app. Further, parents already have tools to block channels they don’t like and to turn search on or off, and YouTube recently rolled out kid profiles.

Plenty of parents I’ve spoken to privately have concerns about YouTube Kids. Will this be enough to change their minds, or to protect kids from seeing inappropriate content on the platform? We’ll have to wait and see.

One thing is clear: YouTube Kids needs a lot more safeguards in place than the regular YouTube app to protect kids and filter out all the weird, gross and crazy content people come up with.
