YouTube says it has removed more than 17,000 channels for hate speech, representing a spike in takedowns since its new hate speech policy went into effect in June.
The Google-owned company calls the June update — in which YouTube said it would specifically prohibit videos that glorify Nazi ideology or deny documented violent events like the Holocaust — a “fundamental shift in our policies” that resulted in the takedown of more than 100,000 individual videos during the second quarter of the year. The number of comments removed during the same period doubled to over 500 million, in part due to the new hate speech policy.
YouTube said the 30,000 videos it removed in the last month generated just 3 percent of the views that knitting videos received during the same period.
“We’ve been removing harmful content since YouTube started, but our investment in this work has accelerated in recent years,” the company wrote in a Sept. 3 blog post detailing its efforts to clean up its platform through the removal of videos that violate its standards.
YouTube, which has 2 billion monthly logged-in users, has come under fire in recent years for its struggle to keep up with the constant influx of new content that promotes violence, hate speech or misinformation. The company has taken steps to patrol its platform and remove such content, relying on its machine learning technology to help detect videos that are in need of human review. In the blog post, YouTube said an update to its spam detection system during the second quarter led to a 50 percent increase in the number of channels that were terminated for violating its spam policies.
Together with Google, YouTube also has a team of over 10,000 people who focus on detecting, reviewing and removing videos that violate its policies.
YouTube says the videos removed represented a fivefold increase compared with the previous three months. Still, in early August the ADL's Center on Extremism reported finding "a significant number of channels" that continue to spread anti-Semitic and white supremacist content.