Browse Cases

2022-10-06 · Other

Why It Matters: The breadth and specificity of the exhibit list signal that plaintiffs intend to prove at trial that Meta possessed extensive internal knowledge of the harms its platforms caused adolescent users. That showing could be significant for establishing the knowledge and design-defect elements of the product liability claims that courts in this MDL have allowed to proceed notwithstanding Section 230 immunity arguments.

2022-10-06 · Other

Why It Matters: This document reveals how §230 and First Amendment protections will be operationalized at the jury-instruction level in the first bellwether trial of a major social media addiction MDL, effectively showing which platform design features a court has already ruled immune from tort liability. The outcome could establish a concrete, feature-by-feature framework for distinguishing actionable product-design claims from immunized publishing decisions, a framework other courts and litigants could adopt or contest in future platform liability litigation.

2022-10-06 · Other

Why It Matters: The motion presents a significant question about whether Section 230 immunity can be invoked not only to defeat substantive liability claims but also to exclude expert damages methodologies that treat a platform's publication of third-party content as the predicate "violation" for penalty calculation purposes, potentially extending §230's reach into the evidentiary phase of litigation. If the court grants exclusion on this ground, it would signal that plaintiffs in platform-liability cases must carefully disaggregate algorithmic and design conduct from publishing conduct even at the damages-quantification stage.

2022-10-06 · Other

Why It Matters: This reply brief illustrates how the §230 immunity question is migrating from the pleadings and summary judgment stages into trial-management rulings, testing whether the court's prior "feature-by-feature" liability framework can be operationalized as an evidentiary filter. The outcome could establish a replicable in limine standard for separating protected editorial and publishing conduct from actionable product-design claims in platform-liability litigation.

2022-10-06 · Other

Why It Matters: This ruling advances a significant and recurring distinction in platform liability litigation: Section 230 and the First Amendment operate as liability bars tied to *content-based* claims, not as blanket evidentiary shields against design-defect theories premised on addiction-inducing, content-agnostic features. It potentially signals that state-court juries will hear extensive evidence about algorithmic architecture even where direct liability for that architecture is nominally cabined by prior rulings.

2022-10-06 · Other

Why It Matters: The available text is insufficient to determine the precise arguments or the court's reasoning, but the existence of a motion in limine framing §230 and the First Amendment as evidentiary shields, rather than pleading-stage defenses, signals that defendants are pursuing these protections through trial to limit what a jury may consider regarding platform content and design features.

Opinion Section 230 Summary Judgment (Reversed)

Lee v. Amazon.com, Inc.

Cal. App. Ct. · 2022-03-17 · Amazon

Issue: Whether Amazon was strictly liable for injuries caused by a defective product sold through the Amazon Marketplace by a third-party seller, consistent with Bolger and Loomis.

Why It Matters: Part of the trilogy of California appellate decisions (Bolger, Loomis, Lee) establishing that Amazon and similar marketplace platforms can face strict products liability in California, regardless of § 230. These cases reflect a significant strand of platform liability doctrine that operates entirely outside the § 230 framework by focusing on the platform's role in commercial transactions rather than its role in hosting user speech.

First Amendment

NetChoice LLC v. Uthmeier

District Court, N.D. Florida · 2 filings
2021-05-27 · Other

Why It Matters: This motion sits at the intersection of two of the most contested questions in platform law: what *Moody v. NetChoice* actually means for state content-regulation statutes, and how courts should evaluate expert testimony about how platform algorithms function. If the court excludes Bapna—particularly on the ground that no real-world platform operates as a pure engagement-maximizer indifferent to content standards—it removes the factual foundation Florida needs to sustain its regulatory theory after *Moody*, and signals how similar evidentiary battles will play out in challenges to comparable laws in other states. Even a narrower ruling grounded solely in methodology would leave open the *Moody* reservation question for the merits, but would deprive defendants of the only expert testimony asserting that algorithmic curation is categorically distinct from protected editorial judgment. For anyone tracking state social-media regulation efforts nationwide, this motion is an early indicator of the evidentiary threshold states will face in constructing post-*Moody* records.

2021-05-27 · Motion for Summary Judgment

Why It Matters: The brief's most consequential — and most legally exposed — move is treating the "dumb pipe" framing as controlling law, when that language appears only in a three-Justice *Moody* concurrence in the judgment, not the majority opinion. If a court accepts it, the result would mark the most significant contraction of First Amendment protection for platform editorial activity in decades. The quasi-facial recharacterization argument is the brief's strongest procedural play, because *Moody*'s substantial-outweighs standard is black-letter law and plaintiffs' post-remand record may not satisfy it. The § 1983 cause-of-action argument, while less prominent in the brief, is doctrinally serious and could foreclose the Section 230 preemption claims entirely without reaching the merits — a clean, narrow path to partial judgment that courts sometimes prefer.

Opinion Section 230 Motion to Dismiss (Reversed)

Lemmon v. Snap, Inc.

9th Cir. · 2021-05-04 · Snap, Inc. (Snapchat)

Issue: Whether § 230(c)(1) bars a negligent design products liability claim against Snap for creating a "Speed Filter" feature that allegedly incentivized users to drive at dangerously high speeds by displaying their real-time speed and rewarding high-speed posts.

Why It Matters: The Ninth Circuit's leading decision establishing that § 230 does not immunize a platform from products liability claims targeting the design of the platform's own features. The design-defect claim targets what the platform built — not what users post — and therefore falls outside § 230's scope. Lemmon is the foundational precedent for the wave of social media design-defect litigation, including cases involving fentanyl trafficking on Snapchat, TikTok's Blackout Challenge, and youth mental health harms from social media features.

Opinion Section 230 Summary Judgment (Reversed in Part)

Loomis v. Amazon.com LLC

Cal. App. Ct. · 2021-04-01 · Amazon

Issue: Whether Amazon could be strictly liable as a seller for defective hoverboards sold by third-party merchants through the Amazon Marketplace, where Amazon took a more passive role in the transaction than it did in Bolger.

Why It Matters: Extended Bolger beyond the FBA context, establishing that Amazon's marketplace model more broadly — not just its fulfillment services — can constitute seller status in California. Clarified that strict products liability in the e-commerce marketplace context turns on the totality of the platform's commercial involvement, not solely on whether it physically handled the product.

Opinion Section 230 Demurrer (Sustained — Affirmed)

Murphy v. Twitter, Inc.

Cal. App. Ct. · 2021-01-26 · Twitter

Issue: Whether § 230 bars a California state civil rights claim against Twitter for suspending a user's account and allegedly discriminating against conservative political viewpoints in its content moderation.

Why It Matters: Applied § 230(c)(2)(A) to defeat a state civil rights challenge to social media content moderation, reinforcing that California's Unruh Act cannot be used to force platforms to reinstate suspended accounts or to impose viewpoint neutrality requirements on editorial decisions. A key case in the debate over whether § 230 forecloses state public accommodations law as a tool for challenging platform moderation.

Opinion Section 230 Summary Judgment (Reversed)

Bolger v. Amazon.com, Inc.

Cal. App. Ct. · 2020-08-13 · Amazon

Issue: Whether Amazon was strictly liable under California products liability law as a seller in the chain of distribution for a defective product sold by a third-party merchant through the Amazon Marketplace.

Why It Matters: A significant products liability decision establishing that marketplace platforms that take an active role in fulfilling consumer transactions can be treated as sellers subject to strict liability, independent of § 230. The case is important in the e-commerce liability context because it does not rest on § 230 — it applies traditional products liability doctrine to Amazon's fulfillment activities. Subsequent California cases (Loomis, Lee) have refined the standard.

Opinion Section 230 Motion to Dismiss (Affirmed)

Dyroff v. The Ultimate Software Grp., Inc.

9th Cir. · 2019-09-16 · WeConnect (The Ultimate Software Group)

Issue: Whether § 230 bars wrongful death claims against an online community platform whose recommendation features allegedly connected a user with the drug dealer who sold him the heroin that killed him.

Why It Matters: Applied § 230 to algorithmic connection-recommendation features, distinguishing the neutral recommendations in WeConnect from the structured, discriminatory questionnaire in Roommates.com. The case illustrates the line between passive publication of connections (protected) and active development of harmful content (not protected), in a context involving serious offline harm from drug dealing facilitated by the platform.

Opinion Section 230 Motion to Dismiss (Reversed in Part)

Enigma Software Grp. USA, LLC v. Malwarebytes, Inc.

9th Cir. · 2019-09-12 · Malwarebytes, Inc.

Issue: Whether § 230(c)(2)(B) immunizes a cybersecurity company that blocks a competitor's software products as "potentially unwanted programs," where the plaintiff alleges the blocking was motivated by anticompetitive rather than content-quality reasons.

Why It Matters: The leading case establishing that § 230(c)(2) immunity is not unlimited — the "good faith" requirement has teeth. Filtering tools and platforms cannot invoke § 230(c)(2) when blocking or filtering decisions are driven by anticompetitive motivations rather than genuine content-quality concerns. Important for understanding the limits of the Good Samaritan provision of § 230 in the cybersecurity and software industry context.

Opinion Section 230 Motion to Dismiss (Affirmed)

Force v. Facebook, Inc.

2d Cir. · 2019-07-31 · Facebook

Issue: Whether Facebook was liable under the Anti-Terrorism Act for allegedly providing Hamas with a communications platform, and whether Facebook's content recommendation algorithm constituted independent tortious conduct not shielded by § 230(c)(1).

Why It Matters: The Second Circuit's definitive pre-Gonzalez holding that algorithmic content recommendation is publisher activity protected by § 230(c)(1). Force v. Facebook directly conflicts with the Third Circuit's subsequent Anderson v. TikTok decision, which held that a platform's targeted algorithmic recommendations constitute the platform's own speech. The resulting circuit split on whether recommendation algorithms are publisher functions or independent platform speech is the central unresolved question in § 230 doctrine post-Gonzalez.

Opinion Section 230 Motion to Dismiss (Affirmed)

Marshall's Locksmith Serv. Inc. v. Google LLC

D.C. Cir. · 2019-06-07 · Google, Yelp, IAC (HomeAdvisor)

Issue: Whether § 230 bars claims against search and directory platforms for listing and promoting fraudulent locksmith businesses that deceived customers with bait-and-switch pricing.

Why It Matters: Applied § 230 to a marketplace fraud context, holding that listing businesses and users who later engage in fraud does not strip a platform of publisher immunity. Consistent with the broad reading of § 230 as immunizing platforms for harms traced back to the conduct of third-party users or service providers listed on the platform.

Opinion Section 230 Appeal (Dismissal Affirmed)

Herrick v. Grindr, LLC

2d Cir. · 2019-03-27 · Grindr

Issue: Whether § 230 bars tort claims against Grindr for failing to remove fake profiles and implement safety features, given that a malicious third party used the platform to orchestrate a harassment campaign against the plaintiff.

Why It Matters: The sharpest circuit-level rejection of the Internet Brands framework. The Second Circuit effectively held that a platform's failure to remove user content after notice — even notice of a coordinated offline harm campaign — is publisher activity within § 230(c)(1). The case illustrates the breadth of § 230 immunity in the Second Circuit and the difficulty plaintiffs face in finding a viable legal theory when a platform's inaction causes serious offline harm.

Opinion Section 230 Appeal from dismissal for failure to state a claim (Affirmed)

HomeAway.com, Inc. v. City of Santa Monica

9th Cir. · 2019-03-13 · HomeAway.com, Inc. and Airbnb Inc.

Issue: Whether Santa Monica's short-term rental ordinance — which prohibited hosting platforms from processing booking transactions for unregistered properties — was preempted by § 230(e)(3) or violated the First Amendment.

Why It Matters: An important § 230(e)(3) preemption decision establishing that state and local laws imposing transactional or conduct-based obligations on platforms are not preempted by § 230, even if compliance requires the platform to check information related to user listings. Demonstrates the limits of § 230 preemption as a blanket shield against generally applicable regulatory obligations — the statute preempts liability for publishing third-party content, but not regulation of a platform's own commercial conduct.

Opinion Section 230 Appeal (Removal order against Yelp reversed; 4-3)

Hassell v. Bird

Cal. · 2018-07-02 · Yelp

Issue: Whether a state court can enforce a defamation judgment by ordering Yelp — which was not a party to the underlying lawsuit — to remove reviews posted by the defendant from its platform.

Why It Matters: Established that § 230 preempts state court attempts to conscript non-party platforms into removing content through injunctions entered in litigation to which the platform was not a party. The decision is significant at the intersection of § 230, injunctions, and due process: a plaintiff cannot circumvent § 230 by obtaining a content-removal order against a platform without naming it as a defendant. The case is frequently cited in debates about procedural mechanisms for victims of online defamation to obtain meaningful relief.
