California’s AB 2: Finally Putting Social Media on the Hook for Hurting Kids

Anya Dalal

For years, social media giants have hidden behind the claim that they are just “neutral platforms.” They rake in billions by designing feeds and algorithms meant to keep kids glued to screens, even as rates of anxiety, depression, and self-harm among adolescents skyrocket. Parents, schools, and healthcare systems have been left to clean up the damage, while Silicon Valley cashes the checks. California’s Assembly Bill 2 is one of the first serious attempts to call their bluff.

AB 2 would make big platforms (those pulling in more than $100 million a year) legally liable when their negligence harms children. Sponsored by Common Sense Media, the bill passed the California State Assembly unanimously in May 2025 and awaits a vote in the California Senate. It creates a legal pathway for victims to recover damages when a platform’s design or algorithmic choices harm a minor. If a company fails to exercise “ordinary care” in those choices, the penalties are enormous: up to $1 million per child, or three times the actual damages. In other words, no more hiding behind fine print or waving off responsibility. If a child is harmed because a platform prioritizes attention over safety, the company should pay.

This is exactly the kind of pressure the industry has spent years trying to avoid. Internal research has shown that certain product features worsen body image issues, encourage compulsive use, and expose kids to toxic content, yet companies roll them out anyway because they’re profitable. A lawsuit carrying million-dollar stakes for every injured child suddenly changes the calculus.

Of course, the tech lobby is already sharpening its knives. They’ll argue that curating a feed is “free speech,” that Section 230 protects them from liability, and that it’s too hard to prove harm. That’s the point: they’ve built their empire on legal shields and regulatory gray zones. AB 2 forces a new conversation regarding whether the First Amendment should really cover manipulative design tricks aimed at thirteen-year-olds.

Yes, proving causation in court will be difficult. And yes, some lawsuits may be messy. But that’s true of any fight against powerful industries that profit from harm—think Big Tobacco or Big Pharma. For years, tech companies have successfully cast themselves as too innovative to regulate and too complex to hold accountable. AB 2 tears at that myth.

Critics will also point out that the bill exempts smaller apps, or warn that it might unleash frivolous litigation. But the reality is that the biggest platforms are the ones doing the real damage. Targeting them first is both practical and overdue.

California has been here before. The state led on tobacco control (the Smoke-Free Workplace Act, AB 13, in 1995), auto emissions (the Pavley Bill, AB 1493, in 2002), and privacy law (the California Consumer Privacy Act in 2018). Each time, the industry claimed the sky would fall. Each time, companies eventually adapted, and the public benefited. AB 2 is another test of whether we’re willing to hold profit-driven corporations accountable when they knowingly harm kids.

Even if AB 2 is watered down in the Senate or struck down in court, the message is clear: the era of platforms profiting from addictive, unsafe designs while dodging responsibility is ending. The only question is whether lawmakers and courts have the backbone to finish the job.
