
How Two Supreme Court Cases Could Affect Internet Liability: Analysis of Gonzalez v. Google and Twitter v. Taamneh

U.S. Supreme Court Hears Cases That Could Impact Section 230 and Internet Liability of Tech Companies

In February 2023, the U.S. Supreme Court heard two cases that could change the way the internet works. In the first case, Gonzalez v. Google, the family of a victim of an ISIS terrorist attack in Paris argues that Google-owned YouTube should be held liable for recommending ISIS videos to users. In the second case, Twitter v. Taamneh, the family of a victim of another ISIS attack in Istanbul seeks compensation from social media platforms on the grounds that they aided and abetted terrorism. At the heart of both cases is Section 230 of the Communications Decency Act, a provision that shields tech companies from liability for what users publish on their platforms.

Why the Two Cases Have Not Been Combined

Although the two cases are similar, they were not combined for argument before the Supreme Court, most likely because they were decided on different grounds as they worked their way up through the lower courts. In Gonzalez v. Google, the lower court dismissed the case based on Section 230. The outcome in Twitter v. Taamneh, by contrast, did not turn on Section 230 but on more conventional grounds: whether the plaintiffs had made sufficient allegations to show that Twitter and the other platforms had aided and abetted terrorism.

Arguments Against Section 230

Section 230 generally shields platforms from liability for what their users do. However, once a platform crosses the line into creating or developing content itself, it loses that protection. In both cases, the families argue that the platforms should be held accountable despite Section 230. Specifically, the dispute centers on where algorithmic recommendations fall in the Section 230 analysis: whether recommending content is the platform's own conduct or merely part of hosting third-party speech.

The Companies’ Response

The companies point to a couple of factual matters that are not in dispute. One is that the terrorist attacks in which the family members died were not planned on the platforms. The families’ argument, rather, is that the platforms allowed ISIS to flourish and should therefore be liable for these specific attacks. The companies’ lawyers question whether that amounts to aiding and abetting at all, arguing that the platforms merely offered the same services they make available to everyone.

Justices’ Opinions

During the Gonzalez v. Google hearing, the justices compared the platforms’ position to that of a bookseller or a telephone company. Justice Clarence Thomas likened holding YouTube liable for recommending a video to a viewer to holding a phone company liable for connecting a caller to the head of ISIS, suggesting that a neutral recommendation of something a user has expressed interest in is not aiding and abetting.

Conclusion

Section 230 of the Communications Decency Act shields tech companies from liability for what users publish on their platforms, but the two cases before the Supreme Court could change that. In both, the families of victims of ISIS attacks argue that the platforms should be held accountable for recommending ISIS videos to users and allowing the terrorist organization to flourish. Although the companies maintain that they did not aid and abet anything, the Court’s decision will shape the future of Section 230 and the liability of tech companies.

FAQs

Under what circumstances does Section 230 immunity apply to an ISP or website?

Section 230 of the Communications Decency Act provides immunity to internet service providers (ISPs) and websites for content that is created by third parties. Specifically, it protects them from being held liable for the content that users post on their platforms. However, this immunity only applies if the platform is not directly responsible for creating the content in question.

What does Section 230 not cover?

Although Section 230 provides broad immunity to ISPs and websites for third-party content, it does not protect them from all forms of liability. For example, it does not shield them from liability for content that they themselves create or develop. Additionally, Section 230 does not protect them from liability for violations of federal criminal law.

What federal law protects big tech companies from lawsuits and liability for spreading news and diverse opinions regardless of whether they are true or false information?

There is no federal law that specifically protects big tech companies from lawsuits and liability for spreading news and diverse opinions, regardless of whether they are true or false. However, Section 230 of the Communications Decency Act provides some protection to these companies for user-generated content on their platforms, as long as they do not create or develop the content in question.

Is Section 230 still in effect?

Yes, Section 230 is still in effect. It was enacted as part of the Communications Decency Act in 1996 and has been a controversial topic in recent years due to concerns about the role of tech companies in moderating online content. However, there have been no significant changes to the law as of February 2023.

Can a website or ISP be held liable for user-generated content under any circumstances?

While Section 230 provides broad immunity to ISPs and websites for user-generated content, there are situations where they can still be held liable. For example, Section 230 does not cover violations of federal criminal law, so knowingly facilitating illegal activity on a platform can still expose it to liability. Platforms can also be held liable for content that they themselves create or develop.