Khari Johnson, CalMatters
This story was originally published by CalMatters. Sign up for their newsletter.
California’s first-in-the-nation privacy agency has retreated from its attempt to regulate artificial intelligence and other forms of computer automation.
The California Privacy Protection Agency had been under pressure to back away from its draft rules. Business groups, lawmakers and Gov. Gavin Newsom argued the rules would encroach on the authority of Congress, where momentum for AI regulation is building, impose costs on businesses and potentially stifle innovation. In a unanimous vote last week, the agency’s board watered down the rules and scaled back protections covering AI and similar systems.
Agency staff estimate the changes will cut companies’ compliance costs from $834 million to $143 million in the first year of implementation, and that 90 percent of the businesses initially required to comply will no longer need to do so.
The retreat marks a pivotal turn in the ongoing and heated debate over the board’s role. Created by voters in 2020, after lawmakers passed the state’s privacy law in 2018, the agency is the only body of its kind in the United States.
The draft rules, more than three years in the making, were revisited after a series of changes at the agency in recent months, including the departures of two leaders: Vinhcent Le, the board member who led the drafting of the AI rules, and Ashkan Soltani, the agency’s executive director.
Consumer advocacy groups worry that the recent changes mean the agency is deferring too much to businesses, particularly tech giants.
With the changes approved last week, the agency’s draft rules no longer regulate behavioral advertising, which targets people based on profiles built from their online activity and personal information. Earlier drafts would have required companies to conduct risk assessments before using or deploying such advertising.
Behavioral ads are used by companies such as Google, Meta and TikTok and their business customers. They can perpetuate inequality, pose a threat to national security and put children at risk.
The revised draft rules also drop the term “artificial intelligence” entirely and narrow the range of business activity regulated as “automated decision-making.” They still require companies to assess the risks of processing personal information and to put safeguards in place to mitigate those risks.
Advocates of stronger rules say the narrower definition of “automated decision-making” lets employers and businesses sidestep the rules by arguing that their algorithmic tools merely advise human decision-makers.
“One concern for me is that if we’re leaving it to industry to define what risk assessments actually look like, then we could end up in a position where they’re grading their own exams,” board member Brandie Nonnecke said during the meeting.
Sacha Haworth, executive director of the Tech Oversight Project, an advocacy group focused on challenging policies that entrench Big Tech’s power, accused the CPPA of watering down rules meant to protect Californians’ data privacy to the benefit of Big Tech, and questioned what the point of the rules will be by the time they are published.
The draft rules retain some protections for workers and students, and still apply when fully automated systems determine outcomes in financial and lending services, housing and health care without a human in the decision-making loop.
Companies and the organizations that represent them accounted for 90 percent of comments on the draft rules before the agency held listening sessions across the state, Soltani said at a meeting last year.
In April, after business groups and lawmakers pressured the agency to weaken the rules, a coalition of nearly 30 labor unions, digital rights and privacy groups wrote a joint letter urging the agency to continue its work to regulate AI and protect consumers, students and workers.
“With each iteration they became weaker and weaker.”
Kara Williams, law fellow at the Electronic Privacy Information Center, on California privacy regulators’ draft AI rules
Roughly a week later, Gov. Newsom stepped in, sending the agency a letter arguing that the rules overstepped its authority and siding with critics who supported rolling them back.
Newsom cited Proposition 24, the 2020 ballot measure that paved the way for the agency. “The agency can fulfill its obligation to issue regulations called for by Prop. 24 without venturing into areas beyond its mandate,” the governor wrote.
Kara Williams, a law fellow at the advocacy group Electronic Privacy Information Center, said the original draft rules were strong. In a call ahead of the vote, she added that the rules “become weaker and weaker with each iteration, and it seems to correlate fairly directly with pressure from the technology industry and trade association groups, so consumers are less and less protected.”
The public can comment on the changes to the draft regulations until June 2. Businesses must comply with the automated decision-making rules by 2027.
Before scaling back its own regulations last week, the agency’s board voted to put its support behind four bills in the California Legislature.
This article was originally published on CalMatters and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.