Technology
French Muslim group sues Facebook, YouTube for Christchurch video
When that little white box on Facebook asks you, “What’s on your mind?”, could Facebook be responsible for what you have to say?
A group representing French Muslims is suing the French branches of Facebook and YouTube for hosting the video of the Christchurch attack in New Zealand. The group, the French Council of the Muslim Faith (CFCM), says that by enabling the live streaming of the mosque shooting, in which 50 people were killed, the tech platforms were “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor,” according to the BBC.
The terrorist who allegedly murdered 50 people and injured 50 more at two mosques in New Zealand on March 15 live streamed the attack at the first mosque on Facebook. Facebook has since said that it removed the video 12 minutes after the live stream ended. But that didn’t stop the video from spreading.
Facebook removed 1.5 million versions of the video in just the first 24 hours after the attack. The video also made its way to other sites across the internet, notably YouTube, where users reported that they could still find versions of the video hours after the attack.
Facebook and YouTube have both stressed that they are cooperating with law enforcement and continuing to work to stop the spread of these videos. But that might be too little, too late.
Generally, social media sites’ terms disallow posting violent material, but Terms of Service have also shielded the sites themselves from legal liability for what their users post.
That precedent may be changing. The controversial FOSTA-SESTA act in the US, ostensibly intended to combat sex trafficking, in practice pushed websites to police sex-related content far more aggressively, because it made them potentially legally liable for any illicit sexual activities taking place on their platforms. Relatedly, European lawmakers just passed a law that makes internet companies legally and financially liable for the spread of copyrighted material on their platforms.
While these two instances are not directly related to liability for violent content, they both challenge the precedent that social media sites are not ultimately responsible for what their users post. That, dovetailing with growing sentiment that Terms Of Service are not sufficient to govern and enforce conduct, content, and privacy on social media, could add credence to the case.
A spokesman for the Federation of Islamic Associations of New Zealand (FIANZ) told Reuters that he supported the French group’s action.
“They have failed big time, this was a person who was looking for an audience and … you were the platform he chose to advertise himself and his heinous crime,” FIANZ spokesman Anwar Ghani told Reuters. “We haven’t been in touch with the (French) group … but certainly something which can deter the social media space in terms of these types of crimes, we would be supportive of that.”