Google has found that its automated systems are now better than humans at identifying and blocking extremist videos, saying they flag three in four offensive videos on YouTube before users report them. The tech giant is using machine learning alongside human reviewers as part of a multipronged approach to tackling the spread of extremist and controversial videos across YouTube, which also includes tougher standards for videos and the recruitment of more experts to flag content in need of review.
The search giant said it more than doubled the number of illegal videos deleted from its platform last month after it adopted artificial intelligence moderators to help police content.
"With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge," said Google. "But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism, as well as the rate at which we've taken this kind of content down."
A month after announcing the changes, and following UK home secretary Amber Rudd’s repeated calls for US technology firms to do more to tackle the rise of extremist content, Google’s YouTube has said that its machine learning systems have already made great leaps in tackling the problem.
A YouTube spokesperson said: "While these tools aren't perfect, and aren't right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed." The company also said that videos reported by users but found not to contain illegal material may be restricted: they will be unable to earn advertising money, and "likes" and comments will be blocked.
"If we find that these videos don't violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state," said Google. The move comes after major companies including Marks and Spencer and McDonald's withdrew their advertising from YouTube earlier this year after discovering their adverts had appeared alongside extremist videos.