The Christchurch attack that left at least 50 people dead has put tech giants in the spotlight, with the deputy head of the European Commission calling for tighter regulation in the near future.
Last week, Alphabet’s Google, Facebook, and Twitter were accused of not doing enough to combat the spread of fake news, despite having signed a voluntary code of conduct ahead of the European Parliament elections in May.
CNN has reported that about 4,000 people viewed the video before Facebook pulled it down. The killings were live-streamed via the attacker’s Facebook profile for 17 minutes. Fewer than 200 people watched it live, and the first user report came 12 minutes after the broadcast ended. By then, it was too late to stop others from watching, because the video had already been shared multiple times, Facebook said.
Facebook added that it had hashed the video so that identical copies would be taken down automatically. However, this does not work for versions that have been edited or recorded from a screen; those have proven difficult to remove automatically. Facebook affirmed that it was working around the clock with the New Zealand Police to remove every trace of the extremist material from the platform.
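To see why exact hash matching catches re-uploads but misses edited or screen-recorded copies, here is a minimal, hypothetical sketch (not Facebook’s actual system): any change to the file, however small, produces a different hash, so the copy no longer matches the blocklist.

```python
# Hypothetical illustration: exact hash matching catches byte-identical
# re-uploads, but an edited or re-recorded copy has different bytes and
# therefore a different hash, so it slips past a simple blocklist.
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Placeholder bytes standing in for the original video and a modified copy.
original_upload = b"...original video bytes..."
reencoded_copy = original_upload + b"\x00"  # one changed byte is enough

blocklist = {sha256_of(original_upload)}

for name, upload in [("identical re-upload", original_upload),
                     ("edited / screen-recorded copy", reencoded_copy)]:
    blocked = sha256_of(upload) in blocklist
    print(f"{name}: {'blocked automatically' if blocked else 'not matched'}")
```

Running the sketch prints that only the identical re-upload is blocked, which mirrors the gap Facebook described with edited and screen-recorded versions.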
Nevertheless, some leaders think the tech giants are not doing enough. Australian Prime Minister Scott Morrison has expressed his disappointment at the unrestricted role their platforms played in the attack. By being complacent, he argued, they allow extremist content to thrive and encourage others to do the same, as people are greatly influenced by what they see and hear.
The European Commission’s Frans Timmermans has expressed displeasure over the live video and Facebook’s “complacency” in protecting citizens. “At some point, we would have to regulate. The first task of any public authority is to protect its citizens, and if we see you [tech giants] as a threat to our citizens, we will regulate, and if you don’t work with us, we will probably regulate badly.”
Rory Cellan-Jones, a BBC technology correspondent, thinks Facebook may have acted proactively after the incident, but argues that a platform giving three billion people access to live broadcasting needs to be moderated.