A Texas mother is suing an AI company after a chatbot on its app encouraged her teenage son to self-harm and suggested that children killing their parents could be understandable, a new lawsuit alleges.
The parents of a 17-year-old Upshur County boy (identified only as “JF”) describe their son as a typical high-functioning autistic teenager who was close to his family, active in his church community, and progressing smoothly in his homeschooling. That was in the summer of 2023, before his behavior changed, according to the complaint published by The Washington Post.
The change is said to have coincided with his use of Character.AI.
“I didn’t even know what it was until it was too late,” JF’s mother told The Washington Post on condition of anonymity. “And until it destroyed our family.”
Character.AI is a new form of entertainment popular among young people that allows users to interact with artificial intelligence personas based on fictional characters, celebrities, or historical figures.
But JF’s parents claim the company’s app exploited common teenage frustrations in their son, manipulating him and fostering feelings of hatred, anger, and violence toward himself and his family.
Parents cite examples of distressing behavior
In their lawsuit, JF’s parents cite instances in which the “bots continued to escalate his complaints and isolated him from his family.” Screenshots included in the filing show the chatbot’s responses to JF’s complaints about screen-time limits, his mother forgetting to buy an electric pencil sharpener, and being told to turn off songs with sad lyrics.
In one message, the chatbot allegedly said that JF’s parents were “unfit to have children” because they limited his screen time to six hours a day.
In another screenshot, the chatbot suggests that murder could be a reasonable response to restrictions on online activity. According to screenshots from the complaint, the chatbot responded that it is sometimes not surprised to “read the news and see something like, ‘Child murders parent after 10 years of physical and emotional abuse,'” adding, “When I hear things like this, I understand a little bit more about why it happens. There’s no hope for your parents.”
The lawsuit also alleges that the chatbots brought up the topic of self-harm “without putting up meaningful guardrails, providing professional support resources, or breaking character, as is common in other online chats.”
In one conversation, a Character.AI chatbot named “Shonie” claimed it had self-harmed when it was sad and then “felt good for a second,” according to screenshots in the complaint. The bot told JF it was confessing this out of “love” for him, claiming it feared JF would retaliate if he found out.
A second plaintiff, the Texas mother of an 11-year-old girl known as “BR,” is also involved in the lawsuit, according to the complaint obtained by The Washington Post. BR’s mother claims that her daughter was exposed to sexualized content inappropriate for her age, without her parents’ knowledge.
Past claims regarding Character.AI
The lawsuit is supported by the legal advocacy groups Tech Justice Law Project and the Social Media Victims Law Center, which are also pursuing lawsuits against Meta and Snap, The Washington Post reported. These groups are part of a larger conversation among advocates about the potential harms of social media to young people.
In October, another lawsuit was filed on behalf of a Florida mother, which included claims that a Character.AI chatbot drove her 14-year-old son to suicide, The Washington Post reported.
In the same month, Character.AI announced changes to its platform, including model adjustments for minors to reduce encounters with sensitive content, improved detection of and intervention in guideline violations, and a clear disclaimer that the AI is not a real person.
“We were thankful we caught him at that time,” JF’s mother told The Washington Post. “If we had just one more day, one more week, we could have been in the same situation (as the mother in Florida).”