The California Legislature introduced roughly 30 new artificial intelligence bills last month aimed at protecting consumers and jobs, in one of the largest efforts yet to regulate emerging technology.
The bills call for some of the nation's toughest regulations on AI, arriving as some technologists warn that AI could eliminate jobs, disrupt elections with disinformation and pose national security risks. Many of California's proposals have broad support and include rules to prevent AI tools from discriminating in housing and health care services. They also aim to protect intellectual property and jobs.
California lawmakers are scheduled to vote on the bills by Aug. 31. The state has already helped shape tech consumer protections in the U.S.: it passed a privacy law in 2020 that limits the collection of user data, and a child safety law in 2022 that provides protections for people under the age of 18.
“As California has experienced with privacy issues, the federal government has not acted, so we feel it is important for California to take action and protect its citizens,” said Assemblymember Rebecca Bauer-Kahan, a Democrat who chairs the Assembly's Privacy and Consumer Protection Committee.
As federal lawmakers drag their feet on regulating AI, state lawmakers are filling the gap with a flurry of bills that could effectively set the rules for the whole country. Tech laws like California's often become a national precedent, in large part because it is difficult for companies to comply with a patchwork of regulations across state lines, so many simply adopt the strictest state's rules everywhere.
State lawmakers across the U.S. have proposed nearly 400 new AI bills in recent months, according to the lobbying group TechNet. California leads the way with 50 bills, although that number is expected to shrink as the legislative session progresses.
Colorado recently enacted a comprehensive consumer protection law requiring AI companies to exercise “reasonable care” while developing their technology, including to avoid discrimination. In March, the Tennessee Legislature passed the ELVIS Act (Ensuring Likeness Voice and Image Security Act), which protects musicians from having their voices or likenesses used in AI-generated content without their explicit consent.
Matt Perault, director of the Center on Technology Policy at the University of North Carolina at Chapel Hill, said it's easier to pass legislation in many states than at the federal level. Forty states now have “trifecta” governments, meaning both houses of the Legislature and the governor's office are controlled by the same party, the most since at least 1991.
“While it remains to be seen which proposals will actually become law, the sheer number of AI bills that have been introduced in states like California shows how much lawmakers care about this issue,” he said.
Victoria Espinel, CEO of the Business Software Alliance, a lobbying group that represents large software companies, said the state's proposals have ripple effects globally.
“Countries around the world are studying these drafts for ideas that could influence their decisions on AI law,” she said.