Section 230 of the Communications Decency Act, 47 U.S.C. § 230, has played a critical role in shaping the modern internet, but the boundaries of its protections remain a persistent point of discussion in law and policy. Ongoing litigation is testing just how far Section 230 stretches when it comes to the design and operation of today’s social media platforms.

Section 230 Overview

Section 230 is a key part of U.S. internet law, and many credit it with creating the internet as we know it today. It provides that online service providers shall not be treated as the publisher or speaker of content posted by users, and it generally insulates platforms from legal responsibility for third-party content. For example, if someone uploads a misleading or offensive post to Instagram, Instagram itself is generally not held legally responsible for what that user shares, because Section 230 protects the platform from being treated as the content's publisher. This immunity has encouraged the growth of the internet economy, allowing platforms to innovate without facing publisher-level liability for every user post or comment.

However, Section 230’s protections are not all-encompassing. They do not extend to liability arising from a company’s own actions, such as product design decisions, business practices, or forms of direct engagement with users. The statute also contains explicit carve-outs for areas like federal criminal law and intellectual property disputes.

Meta Litigation – User Content or Product Design?

Although Section 230 is widely perceived to provide broad protection to online platforms, recent proceedings in federal court in California offer a reminder that the scope of Section 230 immunity is not without bounds. In multidistrict litigation against Meta Platforms, Inc., a coalition of state attorneys general alleges that platform features such as algorithms, infinite scroll, and notifications contribute to adolescent compulsive use and other harmful mental health impacts, and the states are challenging Meta’s reliance on Section 230 as a defense to those claims. The matter is now pending before the United States Court of Appeals for the Ninth Circuit. California v. Meta, Inc., No. 24-7032.

The states’ reply brief (filed October 14, 2025) draws a distinction between user-generated content and product design. The attorneys general assert that their claims focus solely on Meta’s design choices, not the substance of third-party content. On this view, the claims target Meta’s conduct as a product designer rather than as a publisher, and the attorneys general therefore maintain that the claims fall outside Section 230’s protections. Meta counters that these platform features are intertwined with content moderation decisions and are therefore covered by Section 230.

This dispute highlights a trend: regulators and plaintiffs are increasingly scrutinizing the architecture and user experience of online platforms, moving beyond content moderation as the primary legal battleground. The Ninth Circuit’s eventual decision may clarify whether Section 230 shields platforms from liability for core design choices, particularly when those choices allegedly harm young or otherwise vulnerable users.

Takeaways for Online Platforms

Businesses should not assume that Section 230 offers a blanket defense against all forms of internet liability. Where claims turn on a platform’s own design choices or other conduct, rather than on its role in publishing third-party content, Section 230 may not apply. Organizations should also periodically review their product features for potential risk, especially features designed to drive high engagement among minors or other vulnerable user populations. As the contours of Section 230 continue to evolve, staying attuned to these developments is increasingly important, not only for legal compliance but also for ethical and responsible product design.