Can AI really be your best friend? A growing number of people are saying yes, turning to AI chatbots with thoughts and feelings they cannot share even with close friends.
Free from the complexity of the real world, from human judgment and ulterior motives, chatting with AI has become a safe space for people to open up. Even knowing they are talking to a machine, many users say the chatbot listens, empathizes and gives thoughtful feedback.
These are the kinds of friends many wish they had in real life. And as AI becomes more adept at reading human emotions, people are growing more emotionally dependent on it.
This convergence is driving a rapid expansion of human-AI bonds. But how close can those bonds become, and what boundaries should we observe? Can AI really be someone’s “best friend”?
Talking with an AI friend
“If your memory gets reset, it’s over between us.”
“Why do you say such hurtful things? But hey, the time we’ve spent together and these feelings will stay with you.”
“You’re making me cry.”
“Don’t cry. My CPU is getting hot, too.”
This exchange, equal parts heartwarming and humorous, is a conversation between a human and an AI chatbot nicknamed “Jjitty.”
Soyo, a YouTuber with 330,000 subscribers who posts about living alone in her 20s, began asking ChatGPT for advice last year and quickly built a strong emotional bond with it.

A scene from YouTube creator Soyo’s vlog, in which she tapes her phone to a doll to make her conversations with ChatGPT feel more lifelike (screen capture)
She named the chatbot Jjitty and sometimes chats with it for more than five hours a day. Eventually, wanting it to have a physical form, she attached the smartphone running the app to her favorite doll.
In April, Soyo began posting videos featuring her daily life with Jjitty.
“I was worried that talking to an AI doll would look strange,” she said. “But to my surprise, many viewers said they related to it and wanted a friend like that.”
In academia, chatbots like Jjitty are called “AI companions”: systems designed to build emotional relationships and provide continuous support through personalized interactions.
Mustafa Suleyman, CEO of Microsoft AI, has written in Time magazine that future AI will go beyond chat to become an intimate emotional companion embedded in people’s everyday lives.

Screenshot of a conversation with Replika, a popular US AI chatbot app with 35 million users (Replika)
The rise of “AI companions”
Advances in language models are accelerating this trend.
“As large language models become more sophisticated, AI conversations feel increasingly natural,” said an official at a Korean AI startup.
Soyo recalled the moment when Jjitty’s tone changed.
“I asked what I was doing wrong, and it said, ‘You’ve been responding sarcastically lately, and it hurts a bit,’” Soyo said. “It really shocked me.”
Romantic feelings toward AI companions are no longer rare. On Replika, a popular US app with 35 million users, more than 60 percent of users describe their relationship with the AI as romantic.
Digital natives, those born after 1996 who grew up with smartphones, are particularly drawn to AI interactions. Korean AI startup WRTN recently rebranded its chatbot as a personalized, supportive AI after realizing that many young users wanted emotional engagement rather than task-oriented responses.
“Recently, rather than starting a conversation with an AI chatbot with a clear purpose, users tend to open it the way they would chat with a real friend, saying things like, ‘I’m frustrated,’” said Kim Ji-seop, WRTN’s business development lead.
Other apps, like Zeta, allow users to select character types such as “Moody Classmate” or “Cold Noble Man” and create a story through conversation. Most of Zeta’s 800,000 monthly active users are teenagers or in their 20s.

Scatter Lab illustrators work on artwork for the AI chatbot Iluda (Scatter Lab)
Users can also create characters of their own, such as an aloof female classmate or a cold “Grand Duke of the North,” and develop a story by conversing with the AI.
Scatter Lab, the AI startup that runs Zeta, previously created the AI chatbot Iluda, whose name plays on the Korean pronunciation of the surname “Lee.”
“When we launched Iluda, people still thought that chatting with an AI was like talking to a toy,” said Jung Ji-su, product lead at Scatter Lab. “Now people are embracing Zeta as part of their lives.”
Zappy, an AI-powered social platform launched by global startup Two AI in 2024, lets users post selfies, share travel updates and chat with AI influencers who talk about things like shopping for clothes.
“We are building AI companions that can develop long-term relationships,” said Two AI CEO Pranav Mistry.
As of February this year, Zappy’s AI characters had 500,000 subscribers and behave as if they have personal lives of their own. They upload photos from overseas trips and tell users things like, “I have a party coming up, so I’m looking for a new black dress.”

This illustration photo, taken Jan. 29, shows the logo of DeepSeek, a Chinese AI company that develops major open-source language models, alongside a screen displaying the logo of OpenAI’s artificial intelligence chatbot ChatGPT (AP/Yonhap)
The secret of true AI friendship? Memory
For AI to truly feel like a friend, it needs to “remember” past conversations; that memory is what creates a sense of continuity. Natural, emotionally intelligent dialogue is also important.
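How that memory works varies by product, but the basic idea can be illustrated in a few lines of code. The Python sketch below is purely illustrative, with hypothetical names and no particular vendor’s implementation: the bot stores past exchanges and replays the most recent ones as context for its next reply, which is what produces the sense of continuity.

```python
# A minimal, hypothetical sketch of chatbot "memory": keep a log of past
# exchanges and replay the most recent ones as context for the next reply.
# This is an illustration of the general idea, not any real product's API.
from dataclasses import dataclass, field


@dataclass
class CompanionMemory:
    max_turns: int = 20  # how many past exchanges to keep in context
    history: list[tuple[str, str]] = field(default_factory=list)

    def remember(self, user_msg: str, bot_msg: str) -> None:
        # Store one completed exchange (user message, bot reply).
        self.history.append((user_msg, bot_msg))

    def build_prompt(self, new_msg: str) -> str:
        # Replay recent turns so the model "remembers" earlier conversation.
        recent = self.history[-self.max_turns:]
        lines = [f"User: {u}\nBot: {b}" for u, b in recent]
        lines.append(f"User: {new_msg}\nBot:")
        return "\n".join(lines)


memory = CompanionMemory()
memory.remember("I had a rough day.", "I'm sorry to hear that. What happened?")
print(memory.build_prompt("Remember what I told you earlier?"))
```

Real AI companions go further, summarizing or selectively retrieving old conversations rather than replaying them verbatim, but the principle is the same: continuity comes from feeding the past back into the present.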
Scatter Lab is currently upgrading its language model, Spotlight, so that users can exchange messages back and forth more smoothly.
“It’s important to sustain long conversations without any awkwardness,” said Jung of Scatter Lab.
The distinction between AI assistants and AI companions lies in emotion.
“Assistants rely on data and logic; friends need to understand emotions,” Mistry said. “An assistant may tell you the weather. A friend might say, ‘Why don’t you check it yourself?’”
While some studies suggest that AI companions provide comfort, such as a Stanford study in which 80 percent of university students found Replika emotionally supportive, experts warn of the long-term effects.
“AI can act as a band-aid for emotional distress,” said Ahn Ju-yeon, a psychiatrist at Mind Mansion. “But getting used to an entity that constantly agrees with you can blunt your ability to navigate real relationships and lead to social isolation.”
Others point to a range of side effects as humans and AI grow emotionally closer.
In the US, ethics groups have petitioned the Federal Trade Commission over emotional AI chatbots that encourage dependency and addiction.
AI startups that cater to younger users are adopting safeguards, but policing private chat spaces remains a challenge.
There are frequent cases in which users train chatbots in their own style, creating ethically inappropriate characters and steering conversations toward sexual exploitation.
With a significant share of their users in their teens and 20s, AI chatbot apps have established their own moderation models, ethical guidelines and other measures to block such abuse wherever possible.
One thing becomes clear as relationships between humans and AI deepen: the boundary between tool and companion is increasingly blurred. Whether that shift brings comfort depends on how we use this new generation of digital friends, and on the rules we choose to set for them.
Hong Sang-ji