AI chatbots are popping up everywhere, even in kids' apps. There is little research into how children and chatbots interact, children may be prone to oversharing, and some parents are concerned.
Companies are rushing to add AI chat elements to consumer apps and services, including those aimed at children and teens.
Yet little is known about exactly how young people interact with AI chatbots, or what the social and emotional effects of regular use might be. And some parents are worried, especially about younger children who may not understand what is real and what is not.
Chris, a mother in Los Angeles who asked that her last name not be used out of concern for her child's privacy, said she recently had a surprising encounter involving her 10-year-old daughter and an AI chatbot.
With her permission, her daughter had downloaded an app that added extra emojis to the iPad keyboard. One day, the app suddenly gained an "Ask AI" chatbot feature, which suggested kid-friendly searches like Pokémon and Skibidi Toilet memes.
Chris' daughter began talking to the chatbot, gave it a name, and told it her own. She told her mother that the AI chatbot was her "friend." Unsettled, Chris says she deleted the app.
AI also offers opportunities for children
Of course, AI chatbots also present real opportunities to help kids. They can be useful in learning and school settings, for entertainment, and in emotional and therapeutic contexts.
In late 2023, the singer Grimes partnered with toy makers to sell a line of plush AI chatbot toys, voiced by Grimes herself, with speakers and microphones inside so they can chat with kids. (I bought one right away, though my kids lost interest fairly quickly.)
Another AI toy, the $800 robot Moxie, was touted as a tool for supporting social and emotional learning. The robot, launched early in the pandemic, eventually ran out of funding and shut down. Parents whose children had grown attached to their robot friends were distraught. The company ultimately offered an open-source solution that lets owners keep their robots running after the company itself winds down.
Research on AI chatbots for children is limited
Large language models, or LLMs, like the one behind ChatGPT are still so new that there is little scientific or academic research on how teens and children use AI chatbots, or how they are influenced by them. Beyond restricting sexual or violent content, there is no universal guidance on how chatbots for children should be designed.
Dane Witbeck of Pinwheel, which makes phones for kids, said shoehorning AI chatbots into apps for kids and teens is a bad idea. "When we give kids technology that isn't designed for kids, but for adults, we've already seen real harms, and sometimes the consequences are serious."
Researchers at the University of Cambridge published a paper this June urging that LLMs intended for children be designed in a child-safe way, particularly in light of what they call chatbots' "empathy gap," which children often fail to pick up on.
Ying Xu, an assistant professor of AI in learning and education at Harvard University, studies how AI can help elementary school children with literacy and math. Xu sees real potential in educational settings. (She cited the Khan Academy Kids app as an example of AI being used well for children.)
Xu told Business Insider that while there is existing research on how kids use voice assistants like Siri and Alexa, the more sophisticated nature of newer LLMs is not yet fully understood when it comes to children.
"There are studies that have started to explore the link between ChatGPT/LLMs and short-term outcomes, like learning specific concepts and skills with AI," she said in an email. "But there is little evidence on long-term emotional outcomes, which take more time to develop and observe."
James Martin, CEO of Dante, an AI company that builds chatbots for a variety of uses, including education for kids, told Business Insider that parents' concerns are justified.
"Oversharing isn't just possible, it's inevitable," he said. "Kids tell AI things they won't tell their parents, teachers, or friends. The AI doesn't judge. It doesn't guide. It just responds."
How adults can think about AI for kids
If some adults have formed romantic attachments to AI chatbots, imagine how confusing a chatbot that talks like a human could be for a child young enough to believe in Santa Claus.
There are also concerns about AI chatbots being used for mental health support. LLMs tend to reinforce what you're saying rather than challenge you, as a human therapist would.
Tatiana Jordan, CMO of Bark, a company that makes parental-control monitoring software and phones designed for kids and teens, says no one yet knows how regular AI chatbot use affects young people emotionally.
"We're only now getting research on what screen time has done to our kids over the last 15 years," she told Business Insider.
Almost every industry watcher I spoke with agreed on one thing: AI chatbots are here to stay, and parents need to think about how to teach their kids to use them safely rather than avoiding them altogether.
"None of us can stop what's coming with AI," Jordan said. "We have to educate our kids that it's a tool. It can be a positive tool or a harmful one."