Social Media Networks’ Section 230 Immunity on the Chopping Block? New York Court Allows Claims to Proceed Stemming from Buffalo Shooting

April 1, 2024

By: Michelle Cohen

Since 1996, Internet platforms and social media companies have relied on a federal law, Section 230 of the Communications Decency Act, to protect them from liability for civil law claims stemming from content on their platforms. As the influence of platforms like Facebook, Twitter (now X), and others has grown, members of Congress, consumer groups, and other stakeholders have urged Congress to restrict or repeal the networks’ immunity under Section 230, without success. However, a recent ruling on a motion to dismiss by Judge Paula Feroleto in Erie County, New York demonstrates that courts may be willing to consider piercing the Section 230 veil.

Section 230 Background

In 1996, Congress enacted Section 230 (as part of the revisions to the Communications Act) to promote the growth of the Internet and to address inconsistent judicial decisions involving internet platforms (where some were found liable for others’ content and others were not). In summary, Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” and that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”[1] Put simply, website “hosts” are generally not liable for publishing material that another party provides.

Since 1996, social media sites such as Facebook, Reddit, Twitter (now “X”), and Yelp have successfully used the Section 230 defense against a host of lawsuits. The platforms have defeated several types of claims, including defamation, the sale of defective products, and physical abuse stemming from relationships made through a social network. While some courts have allowed suits to move forward (such as the Ninth Circuit in Fair Hous. Council v. Roommates.com, LLC[2]), the majority of courts have granted motions to dismiss based on Section 230 immunity.

Tragedy in Buffalo/Victims Bring Product Liability Claims

On May 14, 2022, Payton Gendron, an 18-year-old white supremacist, shot and killed 10 individuals at a Tops Friendly Markets supermarket on the East Side of Buffalo, New York, and wounded three others. Subsequently, surviving victims and the estates of some of the deceased individuals sued various parties, including several social media platforms: Meta (Facebook), Snap, YouTube, and Reddit. As expected, the social media platforms filed motions to dismiss, asserting Section 230 and the First Amendment to the U.S. Constitution.

The Buffalo victims asserted multiple claims against the social media platforms, including strict products liability (for defective design and failure to warn), negligence, invasion of privacy, and negligent and intentional infliction of emotional distress. The plaintiffs contended that Gendron’s acts were motivated by “white replacement theory” (a conspiracy theory that contends there is a plot to diminish the influence of white people), which Gendron allegedly learned from posts on the social networks’ platforms. While the plaintiffs acknowledged Section 230’s protections, they asserted that the platforms are “negligently, defectively, and harmfully designed ‘products’ that drove Gendron to the materials and they are therefore liable based on product liability theories.”[3]

The plaintiffs based their piercing of the Section 230 shield on the theory that the social media platforms are “more than just message boards containing third-party content.”[4] Rather, the plaintiffs contend that the platforms are “sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with ‘white replacement theory.’”[5] In other words, the plaintiffs framed the platforms as “products” rather than as the “interactive computer service” required for immunity under Section 230.

The Court’s Standard of Review Under Section 230

Following New York case precedent, the court applied a three-part test for determining immunity from state law liability under Section 230: specifically, the court looked to whether “(1) the defendant is a provider or user of an interactive computer service; (2) the complaint seeks to hold the defendant liable as a publisher or speaker; and (3) the action is based on information given by another information content provider.”[6] Judge Feroleto focused on the first factor – the issue of whether the platforms could constitute a product rather than an interactive computer service. Reviewing a motion to dismiss, the court assumed as true the facts alleged in the complaint.

As the plaintiffs contended the platforms constituted “products” under New York law, the court acknowledged that New York courts do not limit “products” to tangible chattel. In fact, New York “has expressly rejected a bright-line rule for the application of product liability law.”[7] Courts instead look to a few factors, including “(1) a defendant’s control over the design and standardization of the product; (2) the party responsible for placing the product into the stream of commerce and deriving a financial benefit; and (3) a party’s superior ability to know – and warn about – the dangers inherent in the product’s reasonably foreseeable uses or misuses.”[8]

Motion to Dismiss Denied

Applying the motion to dismiss standard, the court held that the plaintiffs set forth sufficient facts to allege viable causes of action under product liability theories. Judge Feroleto did not review the product liability factors in depth at the motion to dismiss stage. The court also noted that, following discovery, the social media networks may be in a position to raise similar arguments to defeat some or all of the claims in summary judgment motions.[9]

The product liability claims are novel, but creating this type of liability would essentially defeat the purpose of Section 230 and expand liability for a wide range of third-party actions. In fact, the U.S. Supreme Court has rejected parties’ attempts to hold platforms liable for third-party actions, including violent acts. Last year, in Twitter v. Taamneh,[10] the U.S. Supreme Court unanimously rejected a lawsuit seeking to hold Twitter, Google, and Facebook responsible for a 2017 ISIS nightclub attack in Turkey allegedly resulting from recruiting videos posted on their sites. The Court focused its ruling on whether platforms could be liable for “aiding and abetting” a designated foreign terrorist organization under 18 U.S.C. § 2333 of the Antiterrorism Act (not Section 230), but nevertheless the Court indicated a strong reluctance to hold a platform responsible for third-party content.

Certainly, there is offensive and dangerous content on social media platforms. But to transform these platforms into products and then seek liability for a third party’s criminal actions is not what Congress intended when it enacted Section 230. It is a huge stretch to say that the social media networks created a “product” that caused a white supremacist to gun down over a dozen individuals. It seems likely that many of the claims in the Patterson case will be dismissed following the summary judgment phase.

[1] 47 U.S.C. § 230.

[2] 521 F.3d 1157 (9th Cir. 2008) (Section 230 did not bar housing discrimination claims against a roommate-matching website that required users to answer questions about themselves, including gender, sexual orientation, and children at home, and their housing preferences based on these criteria. The court held that “a website helps to develop unlawful content, and thus falls within the exception to Section 230, if it contributes materially to the alleged illegality of the content.”). Id. at 1168.

[3] Diona Patterson et al. v. Meta Platforms, Inc., et al., Index No. 805896/2023, Decision and Order, NYSCEF Doc. 404 at 3.

[4] Index No. 805896/2023, Decision and Order, NYSCEF Doc. 404 at 4.

[5] Id. (emphasis added).

[6] Index No. 805896/2023, Decision and Order, NYSCEF Doc. 404 at 5, quoting Shiamili v. Real Estate Group of New York, Inc., 17 NY3d 281, 286-87 (2011).

[7] Id.

[8] Index No. 805896/2023, Decision and Order, NYSCEF Doc. 404 at 6.

[9] The court also held that the issue of proximate cause between the criminal actions of the shooter and his social media platform uses is a question of fact for a jury to decide. Id. at 9.

[10] Twitter, Inc. v. Taamneh, 598 U.S. 471 (2023).

Michelle Cohen

At Ifrah Law, Michelle’s practice focuses on helping clients establish powerful and enduring relationships with their customers and prospects while remaining compliant with state and federal law governing privacy and advertising laws and regulations.
