Following criticism for failing to stop footage of the Christchurch mosque shooting from spreading further on its platform, Facebook says it will be given access to footage from body cameras worn by Specialist Firearms Command officers during their training. The social network says it will use the footage to train its algorithms to recognize videos of real-life shootings.
As part of the project, Facebook will supply the cameras to be worn by Metropolitan Police Specialist Firearms Command officers during training exercises.
Facebook and other big tech companies are scheduled to face questioning from US senators about their efforts to remove violent content and prevent the spread of misinformation on their platforms. The hearing will examine online extremism and how effective the industry has been at taking down violent content. The witnesses will also discuss how tech companies are working with law enforcement agencies to remove such content.
“We did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology,” the company said in a blog post, explaining why it has not fully stamped out the spread of violent content. The social network employs both human reviewers and artificial intelligence systems to take down violent content, but so far that combination has not been enough.
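For a sense of what that training involves: systems like this are typically built by fine-tuning a pretrained video classifier on clips labelled as depicting a violent event or not. The sketch below shows the general idea in PyTorch; it is not Facebook's actual pipeline, and the pretrained model, the binary labels, and the dummy batch are assumptions made purely for illustration.

```python
# A minimal sketch (not Facebook's actual system) of fine-tuning a
# pretrained 3D-CNN video classifier on labelled clips, where label 1
# marks footage of a violent event.
import torch
import torch.nn as nn
from torchvision.models.video import r3d_18

# Load a 3D ResNet pretrained on the Kinetics action-recognition dataset
# and replace its final layer with a binary "violent event" head.
model = r3d_18(weights="KINETICS400_V1")
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def train_step(clips: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step. `clips` has shape (batch, 3, frames, H, W)."""
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(clips), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch standing in for real bodycam footage: 2 clips of
# 16 RGB frames at 112x112 pixels, one negative and one positive.
clips = torch.randn(2, 3, 16, 112, 112)
labels = torch.tensor([0, 1])
print(train_step(clips, labels))
```

The point of the bodycam footage is to supply realistic positive examples for exactly this kind of labelled training data, which Facebook says it previously lacked.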
The Metropolitan Police's Assistant Commissioner for Specialist Operations, Neil Basu, acknowledged that any technology which succeeds in automatically stopping the live streaming of attacks once identified will significantly help combat the glorification of such violent acts and the “promotion of the toxic ideologies that drive them.”
Christopher Tegho, a software engineer who specializes in video understanding, explained that the project would take a long time to come to fruition.
“This is definitely a difficult problem and I don’t see Facebook solving it soon,” he said. “We are still not close to the same accuracy levels at recognising what is going on in a scene of a video as we are at recognising what is in a still image.”
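To illustrate the gap Tegho describes, a common baseline treats a video as a stack of independent still frames: each frame is run through an image classifier and the scores are averaged, which throws away the motion and ordering cues that real scene understanding depends on. A minimal sketch of that baseline follows; the pretrained image model and the dummy clip are assumptions for illustration.

```python
# A minimal sketch of a frame-by-frame baseline: classify each still
# frame with a pretrained image model and average the scores. This
# approach sees no motion or ordering, which is part of why video
# understanding lags image recognition.
import torch
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1").eval()

def classify_video(frames: torch.Tensor) -> torch.Tensor:
    """`frames` has shape (num_frames, 3, 224, 224); returns class
    probabilities averaged over all frames."""
    with torch.no_grad():
        logits = model(frames)        # independent per-frame predictions
        probs = logits.softmax(dim=1)
        return probs.mean(dim=0)      # temporal information is lost here

# Dummy clip of 8 frames standing in for decoded video.
video = torch.randn(8, 3, 224, 224)
print(classify_video(video).argmax().item())
```

Closing the gap means replacing this simple averaging with models that reason over time, which is what makes video the harder problem.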
Facebook says its AI systems have removed more than 26 million posts related to extremist groups such as Islamic State and Al-Qaeda. It also said it has banned over 200 white supremacist groups from its platform.
The social network considered video restrictions after the live-streamed New Zealand attacks, but restricting videos alone may not be the best way to curb the problem. Let’s all hope that the plan to use police footage proves a stepping stone toward reducing it.