⚖️ Section 230 🗣️ First Amendment 🤖 AI Liability
Speech Regulation / Platform Autonomy; Compelled Speech / Forced Hosting

NetChoice v. Wilson

🏛 United States District Court for the District of South Carolina, Columbia Division · 📅 2026-02-09 · NetChoice member websites and platforms (social media platforms, content-sharing services, and other covered online services)

Issue

Whether South Carolina's Age-Appropriate Design Code Act violates the First Amendment by imposing content-based restrictions requiring websites to "exercise reasonable care" to prevent harms to minors, mandating specific design features and controls, prohibiting facilitation of certain commercial speech, and compelling submission to third-party audits and public reporting.

What Happened

NetChoice filed a facial First Amendment challenge to South Carolina's Age-Appropriate Design Code Act, which requires covered online services to prevent vaguely defined harms to minors, such as "compulsive usage," "severe emotional distress," and "highly offensive" privacy intrusions, through content moderation and design choices. The complaint argues that the Act imposes content-based speech restrictions by requiring websites to make editorial judgments about lawful expression; compels them to suppress or downrank protected speech; mandates design features that alter content presentation for both minors and adults; imposes strict liability for facilitating lawful commercial speech for age-restricted products; and forces websites to undergo third-party audits and publish reports to the Attorney General. NetChoice invokes *Moody v. NetChoice*, *Reno v. ACLU*, *Miami Herald v. Tornillo*, and Fourth Circuit precedent (*M.P. v. Meta Platforms*) to argue that the Act unconstitutionally regulates editorial discretion and compels speech. This is a complaint seeking declaratory and injunctive relief; no court ruling has issued yet.

Why It Matters

This case represents the next generation of state attempts to regulate social media platforms' content curation and design practices under the banner of child safety, testing the boundaries established in *Moody v. NetChoice*. The South Carolina statute's "duty of care" framework attempts to impose tort liability for editorial choices that cause specified harms to minors. It thus directly implicates the question left open in *Moody*: whether content-neutral design regulation can avoid First Amendment scrutiny, and whether framing speech restrictions as product safety obligations evades constitutional protection for platform editorial judgment.

Related Filings

Other proceedings in the same litigation tracked by this monitor.