YouTube Begins Quarantine of Hate Videos

As of yesterday, YouTube has made it more difficult to find, share, and make money from videos containing hate speech, amid a broader crackdown by Internet companies on racist content following the violence in Charlottesville, Va., earlier this month.

However, YouTube's changes have been in the works for months in response to pressure from both advertisers and governments to rein in messages that promote hate or incite violence. According to a plan laid out in June, videos on YouTube promoting extremism or terror will be identified and removed using a mix of artificial intelligence and human monitors, including people from nonprofit organizations focused on issues like hate speech and terrorist recruiting.

YouTube will not necessarily remove videos flagged as containing "inflammatory religious or supremacist content," but will instead reduce their visibility. Such videos will not be recommended to visitors, will appear behind a warning, will lack comments, and will run without ads. Additionally, users will be unable to embed flagged videos on external sites, which is likely to drastically reduce their reach, according to Bloomberg.

YouTube, which is owned by Alphabet subsidiary Google, believes this approach "strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints."

Other Internet services have made more sweeping moves against hate speech and supremacist groups in recent days. Both GoDaddy and Google revoked domain name registration for the neo-Nazi Daily Stormer site. Web security and performance provider Cloudflare also pulled support for the site, prompting concerns that some companies have too much power over online speech.

A YouTube spokesperson told Bloomberg that video creators who feel their work has been misclassified under the new program will be able to appeal the decision.
