SACRAMENTO, Calif. —
One of the biggest debates California lawmakers face in the final weeks of the legislative session is whether to impose new safety rules on companies developing artificial intelligence.
SB 1047 would require California companies spending at least $100 million developing AI models to conduct safety tests to prevent significant risk or harm. Experts warn that without guardrails, the models could eventually help bad actors create biological weapons or launch cyberattacks aimed at taking down the power grid or collapsing the banking system.
“While the exact timing of these threats is unclear, some of these threats could become reality within the next year,” AI researcher Dan Hendrycks told reporters during an online press conference on Monday. “Product safety testing is standard across many industries, including manufacturers of automobiles, airplanes, prescription drugs, nuclear power plants and more.”
The bill has frustrated industry players, including ChatGPT developer OpenAI, who worry that regulation will slow progress in a growing industry. The company has warned it may be forced to move its operations out of California if the bill passes.
“I understand and I'm used to this being hardline politics,” said state Sen. Scott Wiener, D-San Francisco, who wrote the bill. “Any time you try to pass a law in the public interest, industry will threaten to relocate.”
The issue has divided Democrats, and a group of House members from California, including former Speaker Nancy Pelosi, wrote Gov. Gavin Newsom earlier this month urging him to veto the bill if it reaches his desk.
“In short, we are very concerned about the impact this bill will have on California's innovation economy, and it would not deliver clear benefits to Californians,” the group wrote. “High-tech innovation is the economic engine that drives California's prosperity.”
“Congress is paralyzed on technology policy,” Wiener told reporters, noting that with the exception of the TikTok ban, Congress has not passed any major technology regulation since the 1990s. “I don't say this to bash Congress, but Congress has proven incapable of passing strong technology policy.”
Republican state lawmakers are also divided on the bill.
Rep. Devon Mathis of Visalia told KCRA 3 he plans to vote in favor of the bill. “How can we build public trust when the ones who control regulation are the ones obstructing regulation?” he said.
But some say there are problems with the bill.
“There's a role for government to play in regulation and control,” said Rep. Josh Hoover, a Republican from Folsom, “but my concern with this bill is that it goes too far in that direction before we know what we're dealing with.”
The issue has also divided the tech industry as a whole.
Yann LeCun, chief AI scientist at Meta, said in an X post that “regulating [research and development] would have devastating consequences for the AI ecosystem.”
Elon Musk voiced his support for the bill on Monday night.
“This is a tough call and it will upset some people, but all things considered, I think California should probably pass SB 1047, the AI Safety Bill,” he posted on X. “For over 20 years, I have advocated for regulating AI, just as we do for any product or technology that poses potential risks to the public.”
Gov. Gavin Newsom has not publicly stated his position on the bill.
“We're ahead in this space, and we want to continue to be ahead in this space. We're not going to cede this space to other states or other countries,” he said at an AI summit in May. “If we over-regulate, if we over-indulge, if we chase the hot button stuff, we could get into a dangerous position. But at the same time, we have an obligation to lead.”
State lawmakers are expected to vote on the bill later this week. If approved, the bill would have to go back to the Senate for a vote to approve any changes made while it was in the Assembly. Lawmakers have until midnight Saturday to pass the bill this year. The governor has until Sept. 30 to sign or veto it.