Technology
French Muslim group sues Facebook, YouTube for Christchurch video
When that little white box on Facebook asks you, “What’s on your mind?”, could Facebook be responsible for what you have to say?
A group representing French Muslims is suing the French branches of Facebook and YouTube for hosting the video of the Christchurch attack in New Zealand. The group, the French Council of the Muslim Faith (CFCM), says that by enabling the live streaming of the mosque shooting, which killed 50 people, the tech platforms were “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor,” according to the BBC.
The terrorist who allegedly murdered 50 people and injured 50 more at two mosques in New Zealand on March 15 live-streamed the attack on the first mosque on Facebook. Facebook has since said that it removed the video 12 minutes after the live stream ended. But that didn’t stop the video from spreading.
Facebook removed 1.5 million versions of the video in just the first 24 hours after the attack. The video also made its way to other sites across the internet, notably YouTube, where users reported that they could still find versions of the video hours after the attack.
Facebook and YouTube have both stressed that they are cooperating with law enforcement and continuing to work to stop the spread of these videos. But that might be too little, too late.
Generally, although social media sites’ terms disallow posting violent material, those same Terms of Service also shield the sites from legal liability for what their users post.
That precedent may be changing. The controversial FOSTA-SESTA act in the US, ostensibly intended to combat sex trafficking, pushed websites to police sex-related content far more aggressively, because it made them potentially liable for any illicit sexual activity taking place on their platforms. Relatedly, European lawmakers just passed a law that makes internet companies legally and financially liable for the spread of copyrighted material on their platforms.
While neither of these instances directly concerns liability for violent content, both challenge the precedent that social media sites are not ultimately responsible for what their users post. That, dovetailing with growing sentiment that Terms of Service are not sufficient to govern and enforce conduct, content, and privacy on social media, could add credence to the case.
A spokesman for the Federation of Islamic Associations of New Zealand (FIANZ) told Reuters that he supported the French group’s action.
“They have failed big time, this was a person who was looking for an audience and … you were the platform he chose to advertise himself and his heinous crime,” FIANZ spokesman Anwar Ghani told Reuters. “We haven’t been in touch with the (French) group … but certainly something which can deter the social media space in terms of these types of crimes, we would be supportive of that.”