YouTube has announced a new policy governing videos aimed at minors and young children. The company says it will now remove all content that contains violent or mature themes if it is targeted toward kids, whether through the video's title, its description, or its accompanying tags.
YouTube says this type of content will no longer be allowed on the platform.
Before this change, YouTube was age-restricting such videos; now it's going a step further to clean up the platform and make it a safer place for children amid intense regulatory scrutiny and nonstop criticism of its executive leadership.
The policy change was announced two days ago, but only on a YouTube Help community forum, and it appears to have gone largely unnoticed, with the post amassing just 20 replies and attracting little news coverage.
YouTube says it will ramp up enforcement of the new policy over the next 30 days to give creators a chance to become familiar with the rules.
As part of the process, YouTube says it will remove videos that violate the policy, but it won’t be giving strikes to channels until the 30-day period is up.
YouTube says it won't be issuing strikes for videos uploaded before the policy change, though it still reserves the right to remove those videos.
YouTube advises creators to check the YouTube Kids guidelines if they specifically want to reach children with their videos, and to make sure their descriptions and tags target the right audience so they don't get caught up in the ban.
YouTube also says it will age-restrict more content that could be mistaken for kid-friendly material, such as adult cartoons.
YouTube gives some examples of offending content, such as videos tagged as being for children that feature family-friendly cartoon characters engaging in violent or disturbing activity, like injecting needles. It also warns against content featuring nursery rhymes that touches on mature themes such as sex and violence.