The Christchurch massacre, in which 49 people were killed, was livestreamed on Facebook. One of the gunmen is reported to have broadcast for about 17 minutes while walking into a mosque and opening fire.
Facebook's director of policy for Australia and New Zealand, Mia Garlick, said the horrific footage was removed immediately after police alerted the company. The shooter's Facebook and Instagram accounts were also removed.
Garlick added, “(Facebook is) removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
Twitter also responded promptly after being alerted by New Zealand police, suspending an account related to the shooting.
YouTube, owned by Google, said it was aware of the footage and that it removes shocking, violent and graphic content from its platform.
However, questions are now being raised about how such platforms handle offensive content and how quickly they remove such videos.
Facebook's artificial intelligence tools and human moderators failed to detect the horrific livestream of the shooting.
Lucinda Creighton, a senior adviser at the Counter Extremism Project, an international policy organization, said the tech companies do not treat this as a priority and are doing little to prevent such content from reappearing.