YouTube to Remove Videos Containing Hate Speech, Misinformation

The Google-owned streaming video platform said new policies are in effect to "limit the spread of violent extremist content online."

YouTube plans to remove videos and channels that contain hate speech as part of an effort to clean up extremist content on its platform. 

The Google-owned video site published a blog post Wednesday morning announcing that it has updated its hate-speech policy to specifically ban videos "alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status." That means removing videos that support white supremacy and neo-Nazi ideology, as well as content that denies that violent events like the Holocaust or the Sandy Hook Elementary shooting took place.

"One of the most complex and constantly evolving areas we deal with is hate speech," reads the blog post. "We’ve been taking a close look at our approach towards hateful content in consultation with dozens of experts in subjects like violent extremism, supremacism, civil rights, and free speech." 

The platform did not name any of the videos or channels that would be banned. Last summer, YouTube terminated the channel of Infowars host Alex Jones, which had more than 2 million subscribers, after the right-wing conspiracy theorist sparked outrage for claiming the Sandy Hook massacre was a hoax.

Videos that discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events will remain on the platform. "We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future," YouTube said.

In addition to removing supremacist videos, the platform said it will expand its systems to limit recommendations of "borderline content and harmful misinformation" by the end of the year. The company cited as examples of misinformation "promoting a phony miracle cure for a serious illness or claiming the earth is flat," and said views of such videos have dropped by more than 50 percent since it implemented this change in the U.S. in January.

YouTube also plans to strengthen its enforcement of existing YouTube Partner Program policies. "Channels that repeatedly brush up against our hate speech policies will be suspended from the YouTube Partner program, meaning they can’t run ads on their channel or use other monetization features like SuperChat," the company wrote.

The updates come as YouTube has faced criticism over its struggle to control the spread of violent, extremist, exploitative and misleading information on its platform. YouTube has made an effort to clarify its policies and add more manpower to the teams focused on addressing these issues, but problems have persisted. On Monday, a New York Times investigation detailed the ways in which YouTube's recommendation algorithm exposes children to exploitation by pedophiles. The same day, the company announced that it would prohibit children from live-streaming without an adult present.

Last year, several advertisers pulled their budgets from YouTube after their ads were found alongside hateful and exploitative videos. As a result, YouTube introduced new settings designed to provide greater control to advertisers over where their spots run. 

YouTube’s new policies will take effect immediately, though the company says it will take time for its systems to "fully ramp up."