Google-owned YouTube is being criticized for allowing violent and explicit videos aimed at older audiences to appear in programming for kids. In response, the company has toughened its approach to protecting children from such content on YouTube and YouTube Kids.
Google routinely cracks down on disturbing cartoons, but some videos, such as cartoons disguised as age-appropriate content, slip through the cracks, making enforcement increasingly difficult.
In a blog post, YouTube said it had toughened its guidelines for acceptable content and had in the past week kicked off 50 channels and removed "thousands" of videos under the guidelines. Johanna Wright, Vice President of Product Management at YouTube, said, "To help surface potentially volatile content, we are applying machine learning technology and automated tools to quickly find and escalate for human review."
Videos flagged as inappropriate for kids will be slapped with an age-restriction warning on the main YouTube app, which prevents them from being pulled onto the Kids app. Only users logged into YouTube who are over the age of 18 will be able to view them. Videos flagged with an age restriction (18 and older) are already banned from YouTube Kids, which is geared toward children under 13.
In addition, videos found to have inappropriate ("abhorrent") comments about the kids in them will have comments turned off altogether.
The company said it will roll out the new policy in the coming weeks.