California Governor’s Veto of Controversial AI Safety Bill Sparks Debate Over Technology Regulation and Innovation

California Governor Gavin Newsom has vetoed SB-1047, a contentious bill that would have regulated large artificial intelligence (AI) models. The legislation would have required developers of the largest AI systems to conduct safety testing and build in "kill switch" mechanisms to mitigate potentially catastrophic harms.

In a statement accompanying the veto, Governor Newsom argued that the legislation focused too narrowly on the largest models, an approach that could create a false sense of security around a rapidly evolving technology. He noted that smaller, specialized AI models may pose equal or even greater risks, and warned that the bill could stifle innovation by placing undue restrictions on developers.

Newsom identified several "rapidly evolving risks" associated with AI that he said require more nuanced regulation, including threats to democratic processes, the proliferation of misinformation and deepfakes, and concerns over online privacy and workforce stability. California, moreover, has already enacted a suite of AI laws tailored to specific harms, and other states have initiated similar legislative efforts.

“In its current form, SB-1047 fails to consider whether an AI system operates in high-risk environments or handles sensitive information,” Newsom elaborated. “It applies stringent standards even to basic functions, which could lead to ineffective public protection from genuine risks posed by AI technology.”

State Senator Scott Wiener, who co-authored SB-1047, called the veto a "setback" for advocates seeking oversight of major corporations whose decisions affect public welfare and safety. He criticized the reliance on voluntary safety commitments from AI companies, arguing that without binding regulation the public remains at heightened risk.

A CONTENTIOUS LOBBYING EFFORT

SB-1047 garnered support from notable figures in the AI community, including pioneers Geoffrey Hinton and Yoshua Bengio. It nonetheless faced substantial pushback from industry stakeholders, who objected to what they saw as a heavy-handed approach and to the legal liability the bill would place on developers of open-weight models, who could be held responsible if others modified and misused the systems they released.

After the state Assembly passed the bill in August, a coalition of California business leaders sent Newsom an open letter urging a veto, characterizing the legislation as "fundamentally flawed" for regulating model development rather than misuse and for threatening excessive compliance costs.

Prominent tech companies, including Google and Meta, opposed the measure, though some of their employees publicly supported it. OpenAI Chief Strategy Officer Jason Kwon also argued against the bill, positing that federal regulation would provide more coherent oversight than a patchwork of state laws. Efforts toward federal AI legislation, however, remain stalled amid competing political priorities.

Elon Musk, by contrast, backed SB-1047, arguing that AI should be regulated in the same way as any product or technology that poses a potential risk to the public. The actors' union SAG-AFTRA also supported the bill, presenting it as a vital first step toward safeguarding individuals against dangers such as deepfakes.

Speaking at the 2024 Dreamforce conference, Governor Newsom remarked on the potential chilling effect legislation like SB-1047 could have on the open-source community, signaling a preference for approaches that balance innovation with public protection.


