Google has followed up Facebook's announcement last week about tackling terrorist content, claiming that it too will deploy artificial intelligence to clean up its platforms of terrorist-supporting material.
"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all," Kent Walker, general counsel at Google, said in a blog post.
"Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services."
In a bid to better crack down on such content, Google has announced that it will plough more investment into its machine learning technology to improve its ability to automatically detect and remove terrorist content, while keeping innocent videos, such as news reports, out of the dragnet.
However, because artificial intelligence isn't necessarily all it's cracked up to be, Google has pledged to hire a bunch of new staffers to keep the computers in check.
The company said that it will "greatly increase" the number of independent experts in YouTube's Trusted Flagger programme, noting: "Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech."
Step three of Google's four-step plan is to take a tougher stance on videos that don't blatantly violate its policies, but which still arguably fall on the wrong side of the line.
For example, videos that contain inflammatory religious or supremacist content will be placed behind a warning and will be prevented from getting ad revenue, comments or viewing recommendations.
Finally, Google said that, through a partnership with Jigsaw, YouTube will implement its 'Redirect Method' more broadly across Europe.
"This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining," Google explains.
Google's announcement that it will crack down on terrorist material comes just days after prime minister Theresa May said that social media firms should be fined for failing to remove extremist content.