Met Police releases training video of work to stop terror attack live-streaming

24 October 2019, 08:33 | Updated: 24 October 2019, 08:48

By Nick Hardinges

Facebook has worked with the UK's biggest police force to improve its ability to detect terror attacks live-streamed on its platform.

The social network site faced condemnation for its failure to prevent the spread of a video showing the New Zealand mosque shootings in March, which left 51 dead.

Critics said the tech giant took too long to remove the video, which was viewed about 4,000 times before being taken down, including a few hundred views during the live broadcast itself.

The Metropolitan Police joined forces with Facebook to help the Silicon Valley company improve its detection of live-streamed Christchurch-style terrorist attacks.

Footage released on Thursday morning is intended to help the platform develop technology that can identify terrorists using live-streams, so that police can be tipped off early and the broadcast stopped while the event unfolds.

The Met Police joined forces with Facebook for the training exercise. Picture: PA

Commander Richard Smith, head of the Met's Counter Terrorism Command, said: "The footage we are capturing shows our highly-skilled firearms officers training to respond with the utmost expertise to a wide range of scenarios including the kind of attacks we want to stop terrorists broadcasting."

The social network company provided Met officers at the force's firearms training centres with body cameras.

Video footage shot on those cameras will help the tech giant's artificial intelligence identify the offending video streams more accurately and rapidly.

Some Facebook users downloaded the Christchurch footage and shared it widely online in the aftermath of the far-right attack.

The use of AI is vital for social media companies in spotting terrorist content and taking it down as quickly as possible.

However, after the New Zealand attack, Facebook said it had insufficient first-person footage of violent events for its AI system to learn from.

Police hope the training will help them react quicker to live-streamed violence. Picture: PA

Commander Smith said the tech giant "reached out" to the Met because of its experience in removing "online terrorist propaganda".

"The live-streaming of terrorist attacks is an incredibly distressing method of spreading toxic propaganda, so I am encouraged by Facebook's efforts to prevent such broadcasts," he added.

"Stopping this kind of material being published will potentially prevent the radicalisation of some vulnerable adults and children."

The videos will also be passed on to the Home Office, which will share them with other tech firms to help them develop similar technology.

The London-based force filmed training exercises simulating dangerous events, such as hostage situations and terrorist incidents, in settings including public transport and waterways.

To be effective, an AI system needs a large library of varied imagery from which it can learn to identify relevant footage.

UK police have established the world's first unit designed to work with online service providers to remove terrorist material online.
