California's SB 1047 Advances: Frontier Model Liability on the Brink

SB 1047 — Senator Scott Wiener's Safe and Secure Innovation for Frontier Artificial Intelligence Models Act — cleared the California Senate in May and has now moved through the Assembly Privacy and Consumer Protection Committee with substantial amendments. It is plausibly going to land on the Governor's desk by the end of August. The amended version is meaningfully different from what most early commentary described, so it is worth re-reading.

What the bill actually does

SB 1047 imposes obligations on developers of "covered models," which the amended text defines with a dual threshold: models trained on more than 10^26 integer or floating-point operations, where the cost of that training compute exceeds $100 million. (Both prongs must be met.) The dollar gate puts the bill's effective floor above the Biden Executive Order's bare 10^26 compute threshold, and meaningfully above the EU AI Act's 10^25 trigger for systemic-risk GPAI models.
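The dual threshold is a simple conjunction, which is easy to get wrong if one reads the prongs as alternatives. A minimal sketch (illustrative only; the function name and constants are my own, mirroring the figures described above, and this is not legal advice):

```python
# Illustrative check of the amended "covered model" dual threshold.
# Both prongs must be met: compute > 10^26 ops AND training cost > $100M.
COMPUTE_THRESHOLD_OPS = 1e26      # integer or floating-point operations
COST_THRESHOLD_USD = 100_000_000  # training compute cost gate

def is_covered_model(training_ops: float, training_cost_usd: float) -> bool:
    """Return True only if BOTH the compute and cost prongs are exceeded."""
    return (training_ops > COMPUTE_THRESHOLD_OPS
            and training_cost_usd > COST_THRESHOLD_USD)

# A 2x10^26-op run that cost $80M is NOT covered: the dollar gate fails.
print(is_covered_model(2e26, 80e6))   # False
print(is_covered_model(2e26, 150e6))  # True
```

The conjunction is why the bill sits "above" the Executive Order despite sharing the 10^26 figure: a run that crosses the compute line cheaply still escapes coverage.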

For covered model developers, the amended bill imposes:

  1. A written safety and security protocol, adopted before training begins and retained, with updates, for as long as the model is deployed.
  2. The capability to promptly enact a full shutdown of the covered model.
  3. Annual independent third-party audits of compliance.
  4. Reporting of AI safety incidents to the Attorney General within 72 hours of the developer learning of them.
  5. Whistleblower protections for employees who report noncompliance or safety risks.

Enforcement is by the California Attorney General. The bill creates civil penalties scaled to model training cost (up to 10% of that cost for a first violation, up to 30% for subsequent violations) and authorizes injunctive relief.
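The penalty scaling above can be sketched in a few lines (a hypothetical helper of my own, using the percentages as summarized here; the statute caps the penalty rather than fixing it):

```python
def max_civil_penalty(training_cost_usd: float, prior_violations: int) -> float:
    """Upper bound on the civil penalty: up to 10% of model training cost
    for a first violation, up to 30% for any subsequent violation."""
    rate_pct = 10 if prior_violations == 0 else 30
    return training_cost_usd * rate_pct / 100

# For a hypothetical $150M training run:
print(max_civil_penalty(150e6, 0))  # 15000000.0 (first violation cap)
print(max_civil_penalty(150e6, 1))  # 45000000.0 (subsequent violation cap)
```

Because the penalty keys to training cost rather than revenue, exposure scales with the same dollar figure that triggers coverage in the first place.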

The amendments that matter

The Assembly committee version stripped two of the most-criticized provisions from the Senate-passed bill. Gone is the proposed Frontier Model Division, a new state agency that would have had licensing authority over model deployment. Gone too is the perjury exposure for false safety attestations, replaced with a more conventional civil penalty structure.

What remains controversial:

  1. The duty of "reasonable care" to prevent critical harms, which critics read as open-ended pre-deployment liability.
  2. Treatment of open-weight models: a full-shutdown requirement is hard to square with weights that have already been publicly released.
  3. The line at which a fine-tuned derivative becomes the fine-tuner's responsibility rather than the original developer's.
  4. The Attorney General's pre-harm enforcement authority, which industry argues amounts to regulating the technology rather than its harms.

Industry positioning

The major AI labs have split visibly. Anthropic's public letter on the bill, sent in late July, was a "support if amended" — endorsing the safety frameworks while pushing for narrower scope and a clearer pre-harm vs. post-harm enforcement distinction. OpenAI has opposed the bill outright, citing federal preemption concerns and arguing that frontier-model regulation should sit at the federal level. Google and Meta are publicly opposed; xAI has been quiet.

The venture capital community has been louder than the labs themselves. Andreessen Horowitz and Y Combinator have both organized opposition campaigns. Their argument, that the bill's compliance costs will entrench incumbents and harm open-source development, has resonated with some California legislators and beyond Sacramento: eight Democratic members of the state's House delegation sent a notable letter urging Governor Newsom to veto.

The constitutional questions waiting in the wings

Two constitutional issues, the dormant Commerce Clause and federal preemption, will be raised if the bill passes:

  1. Dormant Commerce Clause. SB 1047 reaches developers wherever situated; a model trained in Texas and deployed nationwide is covered if it touches California users. Industry plaintiffs will argue this regulates conduct outside California in violation of the dormant Commerce Clause. The state will respond that it regulates only the in-state effect, and will cite National Pork Producers Council v. Ross (2023) for the proposition that states can impose costly compliance regimes that incidentally affect out-of-state actors.
  2. Preemption. No federal AI safety statute exists today, so there is nothing express to preempt against. But field or conflict preemption arguments could be built on NHTSA, FAA, or other federal regulatory regimes if those agencies issue AI-specific rules. None of this is ripe yet, but it will be once the federal landscape moves.

What we are watching

The Assembly floor vote, then Newsom's calculus. Newsom has a long record of signing tech-skeptical legislation but has also been visibly supportive of California's AI industry. He vetoed two AI bills in 2023 on similar grounds (regulate harms, not technology) — a signal worth taking seriously here.

If signed, expect a federal court challenge by mid-2025 and a long compliance buildout. If vetoed, expect Senator Wiener to bring it back with modest revisions in 2025. Either way, SB 1047 has succeeded in setting the terms of debate for what frontier-model regulation looks like in the U.S. — a debate that will spread to other states regardless of California's outcome.