The Supreme Court on Thursday declined to address the legal liability shield that protects tech platforms from liability for their users’ posts, the court said in an unsigned notice.
The ruling leaves in place, for now, a broad liability shield that protects companies like Twitter, Meta’s Facebook and Instagram, and Google’s YouTube from being held responsible for the speech of their users on their platforms.
The court’s rulings in these cases come as a big sigh of relief for tech platforms, but many members of Congress remain eager to reform the legal liability shield.
In Gonzalez v. Google, the court said it would “decline to address the application” of Section 230 of the Communications Decency Act, the law that protects platforms from liability for their users’ speech and allows services to moderate or remove users’ posts. The court said it made the decision because the complaint “appears to state little, if any, plausible claim for relief.”
The Supreme Court sent the case back to a lower court for reconsideration in light of its decision in a separate but similar case, Twitter v. Taamneh.
In that case, the family of a U.S. victim of a terrorist attack sought to hold Twitter liable under anti-terrorism law for allegedly aiding and abetting the attack by failing to take sufficient action against terrorist content on its platform. In a decision written by Justice Clarence Thomas, the court ruled that such a claim could not be brought under that law.
“As alleged by the plaintiffs, the defendants designed virtual platforms and knowingly failed to do ‘enough’ to remove ISIS-affiliated users and ISIS-related content – among hundreds of millions of users worldwide and an immense ocean of content – from their platforms,” Thomas wrote in the court’s unanimous opinion.
“Yet the plaintiffs have not alleged that the defendants intentionally provided substantial assistance in the Reina attack or knowingly participated in it, much less that the defendants aided ISIS so pervasively and systemically as to make them liable for every ISIS attack,” he added, referring to the nightclub where the terrorist attack took place.
Many lawmakers view Section 230 as unnecessary protection for a massive industry, though its proponents say the law also protects smaller players from costly lawsuits because it allows cases involving user speech to be dismissed at an earlier stage. Yet lawmakers remain divided on what form changes to the law should take, meaning there are still huge hurdles to making them happen.
WATCH: The mess of content moderation on Facebook, Twitter, YouTube
