YouTube is making changes to how it treats videos posted by minors, including prohibiting them from live-streaming unless an adult is present, in response to renewed concerns that the platform is being used to exploit children.
The Google-owned video site published a blog post on Monday morning outlining the policies it has in place to protect young people who use it to post and view videos. The post follows a New York Times investigation also published on Monday that exposes the ways in which YouTube’s algorithm promotes videos of children that can be exploited by pedophiles.
In the post, YouTube says it has updated its live-streaming policy to bar minors from using the broadcasting feature unless they are accompanied by an adult, and that it could revoke live-streaming privileges from channels that don't comply with the rule. In recent months, YouTube has also disabled comments on videos featuring children “to limit the risk of exploitation” and has reduced recommendations of what it calls “borderline content” featuring children in “risky situations.”
As YouTube has grown — it now has more than 2 billion monthly users — it has struggled to stamp out inappropriate, misleading and exploitative videos. Despite trying to make its platform safer for children by releasing the more curated YouTube Kids app in 2015, YouTube continues to be used to take advantage of young children.
In the latest report, The Times outlined how YouTube’s recommendation system takes viewers down a “rabbit hole” that can lead to more extreme and inappropriate content, including videos of younger children. The result, per the report, is “a catalog of videos that experts say sexualizes children.”
While YouTube says it is committed to eliminating these issues, The Times reports that the company will not turn off its recommendation algorithm on videos of children. Recommendations are one of the primary ways that YouTube keeps people engaged on the platform and drives lengthy watch times.
YouTube, which is led by CEO Susan Wojcicki, says it takes the safety of children and families seriously and that children under 13 are not allowed to use the platform. During the first three months of the year, it removed more than 800,000 videos that violated its child safety policies.