The UK has been hit by three terror attacks, and a suspected fourth, over the past few months, causing death and suffering across the country.

As the UK government attempts to hold those responsible for the attacks to account, it is also asking how people become radicalised and come to hold extremist views. Much of the blame has fallen on technology companies such as Facebook, Google, and Twitter, which have been criticised for not doing enough to prevent the spread of online extremism across their social networks.

In March this year, UK prime minister Theresa May’s office, Downing Street, said that tech companies can and must do more to ensure dangerous content isn’t being promoted online.

A Downing Street spokesperson said:

“Social media companies have a responsibility when it comes to making sure this material is not disseminated and we have been clear repeatedly that we think that they can and must do more. We are always talking with them on how to achieve that. The ball is now in their court. We will see how they respond.”

This week, Google, which owns the popular video platform YouTube, announced new steps it is taking to tackle terrorist content on its platforms. In a blog post, Google’s general counsel, Kent Walker, said:

“Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution.

“There should be no place for terrorist content on our services.”

These are the four ways that Google is pledging to deal with terrorism online.

1. Using machine learning to identify extremist and terrorism-related videos

Walker said that the tech giant has used video analysis models to find and assess more than 50 percent of the terrorism-related content it has removed over the past six months. Building on this, it will now train its machine learning technology to identify and remove extremist content more quickly.

Google isn’t the only platform doing this. Last week, Facebook announced it will be employing artificial intelligence (AI) software to identify terrorist content and remove it from the network.
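
Neither company has published details of its models, but the broad pattern is familiar: score each piece of content with a trained classifier and route high-scoring items for removal or human review. The sketch below illustrates that idea with scikit-learn on hypothetical text metadata such as video titles; the training examples, labels and thresholds are invented for illustration and bear no relation to how Google or Facebook actually build their systems.

```python
# Illustrative sketch only: a toy text classifier that scores new uploads so
# high-scoring items can be queued for removal or human review.
# Training data and labels are invented; real systems also analyse video,
# audio and imagery, not just text metadata.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples (1 = policy-violating, 0 = benign)
train_texts = [
    "propaganda calling for recruitment to violent jihad",
    "graphic footage glorifying a terror attack",
    "news report on counter-terrorism policy",
    "documentary interview with attack survivors",
]
train_labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic regression classifier
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

new_uploads = [
    "recruitment video glorifying the attack",
    "cookery tutorial for beginners",
]
for text in new_uploads:
    score = model.predict_proba([text])[0, 1]
    # In a production pipeline, items above a tuned threshold would be
    # removed automatically or routed to human reviewers.
    print(f"{score:.2f}  {text}")
```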

2. Improving the YouTube Trusted Flagger programme

As well as relying on technology to flag extremist videos and comments, Google needs people working alongside it to help identify these “problematic videos”. It employs a group of experts through its Trusted Flagger programme to review and remove reported content, and Walker said these teams are accurate more than 90 percent of the time.

Google has pledged to expand this programme by adding 50 expert NGOs to the 63 organisations it already works with, supporting them with operational grants. It will also expand its work with counter-extremist groups to help identify content that aims to radicalise and recruit extremists.

3. Making extremist videos harder to find

A few months ago, the company came under fire for allowing adverts to appear alongside extremist content, prompting brands such as M&S, RBS and Sainsbury’s to pull their advertising from the platform. In response, Google has said it will take a “tougher stance” on videos that contain inflammatory religious or supremacist content: these videos will appear behind warnings and will not be monetised or eligible for comments, making them harder to find.

“We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints,” said Walker.

4. Improving counter-radicalisation efforts

Google’s subsidiary Jigsaw is dedicated to exploring how technology can be used to counter extremism, and one of the ways it does this is through its “Redirect Method”. This uses targeted online advertising to reach potential ISIS recruits and redirects them toward anti-terror videos that attempt to change their minds about joining.

When this system has been used before, potential recruits have clicked through on the ads and watched over half a million minutes of video content that debunks terrorist recruiting messages. Google will work to implement this more broadly across Europe.
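
At its core, the Redirect Method relies on ordinary search-advertising machinery: campaigns target the kinds of queries potential recruits type, and the resulting ad points to a curated playlist of counter-narrative videos. A very rough sketch of that keyword-to-playlist routing is below; the keywords and URLs are invented placeholders, not Jigsaw’s actual campaign data.

```python
# Illustrative sketch only: route search queries matching campaign keywords to
# curated counter-narrative playlists. Keywords and URLs are invented.
from typing import Optional

COUNTER_NARRATIVE_CAMPAIGNS = {
    ("recruitment", "join", "fighters"): "https://example.com/playlists/former-members-speak",
    ("caliphate", "governance"): "https://example.com/playlists/life-under-isis",
}

def match_campaign(query: str) -> Optional[str]:
    """Return a counter-narrative playlist URL if the query hits a campaign."""
    words = set(query.lower().split())
    for keywords, playlist in COUNTER_NARRATIVE_CAMPAIGNS.items():
        if words & set(keywords):
            return playlist
    return None

print(match_campaign("how to join the caliphate"))  # matched -> playlist URL
print(match_campaign("weather tomorrow forecast"))  # no match -> None
```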

Walker said:

“Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right.

“Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge. We are committed to playing our part.”