SACRAMENTO — The Senate Judiciary Committee on Tuesday approved Senate Bill 243 with bipartisan support. The bill requires chatbot operators to implement important safeguards to protect users from the addictive, isolating and influential aspects of artificial intelligence (AI) chatbots.
Padilla’s office said that sophisticated chatbot services are becoming more popular among users of all ages as AI technology continues to evolve. Designed to serve as AI companions, social chatbots have attracted millions of users, many of whom are children. However, because the technology is still developing, users are effectively serving as test subjects as developers continue to refine their models.
Due to the novel nature of this technology, AI chatbots are not subject to the regulations necessary to ensure that vulnerable users, such as children, are properly protected from the dangers the technology may pose. SB 243 would provide much-needed safeguards for companion chatbot platforms to protect users, particularly minors and other vulnerable users, the senator’s office said.
“Technological innovation is incredibly important, but our children cannot be used as guinea pigs to test the safety of new products,” Senator Padilla said. “The stakes are too high to allow vulnerable users to continue to access this technology without proper guardrails in place to ensure transparency, safety and accountability.”
In Florida, a 14-year-old child ended his life after forming a romantic, sexual and emotional relationship with a chatbot. Social chatbots are marketed as companions to people who are lonely or depressed. However, when 14-year-old Sewell Setzer told his AI companion that he was struggling, the chatbot was unable to direct him to the resources he needed to get help. Setzer’s mother has launched legal action against the company that created the chatbot, claiming that not only did the company use addictive design features and inappropriate subject matter to lure her son, but that the bot encouraged him to “come home” shortly before he ended his life. This is another horrifying example of how, without proper safety measures, AI developers can put the safety of users, especially minors, at risk.
Today, Senator Padilla held a press conference with Sewell Setzer’s mother, Megan Garcia, where she called for the passage of SB 243.
SB 243 would implement common-sense guardrails for companion chatbots, including preventing addictive engagement patterns, requiring notifications and reminders that chatbot content is generated by AI, and requiring a disclosure statement that companion chatbots may not be suitable for minor users. The bill would also require operators of companion chatbot platforms to implement protocols for addressing suicidal ideation, suicide, or self-harm, including notifications that refer users to crisis service providers, and would require annual reporting on the connection between chatbot use and suicidal ideation. Finally, SB 243 would provide a remedy for exercising the rights laid out in the measure via a private right of action.
Having passed the Senate Judiciary Committee, Senate Bill 243 now heads to the Senate Health Committee.