CASE PREVIEW

Justices will consider whether tech giants can be sued for allegedly aiding ISIS terrorism


In November 2015, ISIS conducted a series of coordinated attacks around Paris that killed 130 people and wounded nearly 500 more. Just over a year later, in the early hours of New Year’s Day 2017, 39 people were killed in an ISIS attack on an Istanbul nightclub. This week, the Supreme Court will hear oral arguments in a pair of cases arising from those attacks. The justices’ decisions in Gonzalez v. Google and Twitter v. Taamneh could reshape legal liability for some of the nation’s largest technology companies.

Gonzalez v. Google

The question at the center of Gonzalez, which will be argued on Tuesday, is the scope of Section 230 of the Communications Decency Act of 1996, which generally shields tech companies from liability for content published by others. The justices will consider whether that landmark statute protects internet platforms when their algorithms target users and recommend someone else’s content.

The question comes to the court in a lawsuit filed by the family of Nohemi Gonzalez, a 23-year-old American woman who was killed in the 2015 ISIS attack on a Parisian bistro, La Belle Équipe. They brought their lawsuit under the Antiterrorism Act, arguing that Google (which owns YouTube) aided ISIS’s recruitment by allowing ISIS to post videos on YouTube that incited violence and sought to recruit potential ISIS members, and by recommending ISIS videos to users through its algorithms.

A divided panel of the U.S. Court of Appeals for the 9th Circuit ruled that Section 230 protects such recommendations, at least when the provider’s algorithm treats all content on its website similarly. The majority acknowledged that Section 230 “shelters more activity than Congress envisioned it would.” However, the majority concluded, Congress – rather than the courts – should clarify how broadly Section 230 applies. The Gonzalez family then went to the Supreme Court, which agreed last year to weigh in.

In the Supreme Court, the Gonzalez family insists that recommendations are not always shielded from liability under Section 230. Whether they are protected, the family says, hinges on whether the defendant can meet all of the criteria outlined in Section 230, which bars providers of “an interactive computer service” from being “treated as the publisher … of any information provided by” a third party. For example, the family argues, Section 230 does not protect a defendant from liability for recommendations that contain material that the defendant itself created or provided, such as URLs for the user to download or “notifications of new postings the defendant hopes the user will find interesting,” because in that scenario, the information would not be provided by someone else.  

A website like YouTube is also not shielded from liability, the family continues, when it provides unsolicited recommendations that it thinks will appeal to users. In that scenario, the family asserts, the defendant is not providing access to a computer server (because the user is not making a request) and therefore is not acting as a “provider … of an interactive computer service.”

Because Section 230 does not always provide tech companies with immunity for their recommendations, the family concludes, the 9th Circuit should not have thrown out its claim. But, the family stresses, even if Google is not entitled to immunity under Section 230, that is only the beginning of the inquiry: The family must then show that Google can be held liable under federal antiterror laws for its recommendations.

The Biden administration agrees with the Gonzalez family that the court of appeals was wrong to dismiss its claim based on YouTube’s recommendations of ISIS content, but its reasoning focuses only on how YouTube’s algorithms operate and on their effect. YouTube’s suggested videos, the administration notes, appear on the side of each user’s YouTube page and will “automatically load and play when a selected video ends.” In so doing, the administration explains, YouTube “implicitly tells the user that she ‘will be interested in’” the content of that video – which is a separate message from the message in the video itself. Therefore, the administration concludes, although the family may ultimately “face obstacles” in proving their claims under the ATA, Google and YouTube are not entitled to immunity under Section 230 because the family is seeking “to hold YouTube liable for its own conduct and its own communications, above and beyond its failure to block ISIS videos or remove them from the site.”

In their brief on the merits, Google and YouTube condemn terrorism and emphasize that they have taken “increasingly effective actions to remove terrorist and other potentially harmful conduct.” But Section 230 bars the family’s claims against them for YouTube’s recommendation of ISIS-related videos, they maintain, because the provision provides immunity from claims that treat the defendant as a publisher. And just as a newspaper acts as a publisher when it puts together an opinion page filled with essays and columns written by other people, the companies write, YouTube acts as a publisher when its algorithms “sort and list related videos that may interest viewers so that they do not confront a morass of billions of unsorted videos.”

Google and YouTube urge the justices not to “undercut a central building block of the modern internet.” If Section 230 does not protect YouTube’s efforts to organize the videos that others post on its site, they caution, neither Gonzalez nor the Biden administration has a “coherent theory that would save search recommendations and other basic software tools that organize an otherwise unnavigable flood of websites, videos, comments, messages, product listings, files, and other information.”

Google and YouTube offer the justices an off-ramp, noting that the Gonzalez family’s claims in this case are “materially identical” to the claims in Twitter v. Taamneh, which will be argued on Wednesday. If the court were to rule that the Taamneh family’s claim cannot go forward under the ATA, the tech companies tell the justices, then the Gonzalez family’s claims also cannot go forward, so there would be no need for the justices to decide whether Google and YouTube are shielded from liability under Section 230.

Twitter v. Taamneh

In the Twitter case, the justices agreed to decide whether Twitter (along with Facebook and Google, which were also defendants in the lower courts) can be held liable, regardless of Section 230, for aiding and abetting international terrorism based on ISIS’s use of the companies’ platforms.

The lawsuit was filed by relatives of Nawras Alassaf, a Jordanian citizen who was among the 39 people killed in the January 2017 ISIS attack at the Reina nightclub in Istanbul. His family, the Taamnehs, sued in federal court in California under the Antiterrorism Act, which allows U.S. nationals to sue anyone who “aids and abets, by knowingly providing substantial assistance,” international terrorism. The family contended that Twitter and the other tech companies knew that their platforms played an important role in ISIS’s terrorism efforts but, despite extensive press coverage and government pressure, did not act aggressively to keep ISIS content off those platforms.

The 9th Circuit allowed the Taamneh family’s aiding-and-abetting claim to go forward. It acknowledged that the tech companies’ policies bar users from posting content that promotes terrorism, and that the companies regularly remove posts with ISIS-related content. And although it stressed that “[n]ot every transaction with a designated terrorist organization will sufficiently state a claim for aiding-and-abetting liability under the ATA,” it concluded that the Taamneh family had done so in this case. Twitter went to the Supreme Court, which agreed last year to weigh in.

In the Supreme Court, Twitter urges the justices to overturn the 9th Circuit’s ruling. The company argues that a defendant can be held liable under the ATA, as amended by the Justice Against Sponsors of Terrorism Act, only when it has provided substantial assistance for a specific act of international terrorism – such as the attack on the Reina nightclub. But, Twitter emphasizes, the plaintiffs have not even alleged that the terrorists responsible for the Reina attack ever used Twitter.

Twitter’s actions also fell short of the kind of “knowing” assistance required for liability under the ATA, the company says. It is not enough that Twitter knew that terrorists used its platforms, even though Twitter’s policies barred them from doing so. Instead, Twitter argues, it can only be held liable if it knew about “specific accounts that substantially assisted the Reina attack” and knew “that not blocking those accounts would substantially assist such an attack.” But, it stresses, the plaintiffs concede that Twitter “rarely knew about specific terrorist accounts or posts,” and they do not allege that Twitter “knew about yet failed to block any account or post that was used to plan or commit the Reina attack or any other terrorist attack.”

The Biden administration agrees that the 9th Circuit’s decision should not stand, but it takes a slightly different (and broader) view of liability than Twitter. In its view, a defendant could in some circumstances be held liable under the ATA even if it did not specifically know about, or provide support for, the particular terrorist attack that injured the victim. But, the government adds, plaintiffs must allege more than that the defendants simply provided “generalized support to a terrorist organization through the provision of widely available services” – and the Taamneh family has not done so in this case.

The Taamneh family counters that the ATA was intended to provide plaintiffs with “the broadest possible basis” to sue companies and organizations that provide assistance to terrorist organizations. And the text of the ATA, the family says, makes clear that it does not require a connection between the assistance that the defendant provides and a specific terrorist attack: It is enough that the defendant provided assistance to the broader terrorist organization. “Twitter’s proposed interpretation of” the ATA, the family writes, “would implausibly segregate a particular terrorist act from the overall campaign of terror of which it was an integral part, requiring courts to ignore the often long chain of events which enabled a foreign terrorist organization to mount such an attack.”

Both sides warn of dire consequences if the other side prevails. Twitter suggests that the family’s theory could create a “novel and boundless conception of aiding-and-abetting liability” that could expose aid organizations and NGOs to liability if they provide assistance that eventually reaches and assists ISIS’s general operations, even if there is no connection to a specific terrorist attack.

Facebook and Google echo Twitter’s concerns. They tell the justices that a ruling for the family could mean that social-media companies could be sued under the ATA “for virtually any terrorist attack ISIS ever commits, at any time and anywhere in the world, simply because their efforts to prevent ISIS members or supporters from exploiting their services were not, in a jury’s estimation, sufficiently ‘aggressive.’” That liability, they continue, could extend to a wide range of other companies whose products or services could be used by terrorists.

But the Taamneh family says that Twitter’s construction of the law would be so narrow that it would be almost useless: It would only apply, for example, “to a fellow terrorist who handed a killer a firearm” and “could not as a practical matter be applied to the types of outside assistance that most matters to terrorist organizations, such as contributions, banking services, and social media recommendations.” Twitter’s theory, the family posits, would also “require a type of knowledge which almost no one but a terrorist would usually possess.” 

Even as the justices grapple with the weighty questions in the Google and Twitter cases, they are also aware that another pair of cases involving social-media companies is lurking on the horizon. In January, the justices asked the Biden administration for its views on the challenges to controversial laws, enacted in Florida and Texas, that seek to regulate the content-moderation policies of social-media companies like Facebook and Twitter. Both laws were passed in response to beliefs that social-media companies were censoring their users, particularly those expressing conservative beliefs. If, as expected, the Florida and Texas cases eventually return to the Supreme Court, the court’s rulings could create a conundrum for tech companies: A decision that curtails Section 230 could require tech companies to remove content in order to avoid expanded legal liability, while the Texas and Florida laws could restrict the companies’ ability to do so.

This article was originally published at Howe on the Court.

Recommended Citation: Amy Howe, Justices will consider whether tech giants can be sued for allegedly aiding ISIS terrorism, SCOTUSblog (Feb. 19, 2023, 2:51 PM), https://www.scotusblog.com/2023/02/justices-will-consider-whether-tech-giants-can-be-sued-for-allegedly-aiding-isis-terrorism/