File – Icons for the ChatGPT, Gemini, Microsoft Copilot, Claude and Perplexity apps displayed on a Google Pixel smartphone. AI competition concept. (Getty Images)
The AI chatbot is stepping into the therapist’s chair, but not everyone is excited about it.
In March alone, 16.7 million posts from TikTok users discussed using ChatGPT as a therapist, but mental health experts are raising red flags over the growing trend of artificial intelligence tools being used to address anxiety, depression and other mental health challenges.
“ChatGPT has made me a less anxious person when it comes to dating, health, career,” user @Christinazozolya shared in a TikTok video posted to her profile last month.
“Whenever I’m feeling anxious, instead of bombarding my parents with texts like I used to, texting my friends, or just spiraling, I put it all into ChatGPT, and it does a good job of calming me down and providing instant relief that, unfortunately, not everyone has access to.”
Others use the platform as a “crutch,” including user @karly.bailey, who says she turns to it for “free therapy” as someone who works at a startup and does not have health insurance.
“I tell it what’s going on, how I’m feeling and literally every detail, as if it were my girlfriend, and it gives me the best advice,” she shared.
“It also offers journaling prompts or EFT (emotional freedom technique) tapping.”
These users are not alone. A study by Tebra, an operating system for independent healthcare providers, found that “one in four Americans are more likely to talk to an AI chatbot instead of taking part in treatment.”
In the UK, some young adults are choosing AI mental health counselors for their perceived convenience over long National Health Service (NHS) wait times, and to avoid paying for private counseling, which costs around £400 (approximately $540).
According to The Times, data from Rethink Mental Illness found that more than 16,500 people in the UK had been waiting over 18 months for mental health services, suggesting that the cost burden, wait times and other hurdles to seeking care can strengthen the urge to turn to more affordable and convenient alternatives.
However, critics say that while these virtual bots may be accessible and convenient, they lack human empathy, and people in crisis risk never receiving the tailored care they need.
“I actually spoke to ChatGPT, and I tested some prompts to see how it would respond. ChatGPT can pull information from Google, integrate it and take on the role of a therapist,” Sarfo said.
Some GPTs, such as Therapist GPT, are tailored specifically to provide “comfort, advice, and therapeutic support.”
It may also be more cost-effective than traditional therapy: ChatGPT Plus costs $20 a month and offers perks such as unlimited access and faster response times, but the platform cannot measure up to experts who can diagnose, prescribe medication, monitor progress or de-escalate serious issues.
“I don’t think it’s a replacement for a real therapist, who can tailor treatment and support people through more complex mental health issues,” Sarfo added.
He said there is a risk that people will conflate advice from tools like ChatGPT with legitimate guidance from professionals who have years of expertise dealing with mental health issues and have learned how to adapt to a variety of situations.
“I am particularly concerned about people who may need psychotropic medications but use artificial intelligence to help themselves feel better and treat it as therapy. But sometimes… both therapy and medication are indicated, and there is no way to get the right medication without going to a real professional.”
However, some aspects of the chatbot can be beneficial for those who need support, especially people looking for ways to talk to their doctor about conditions such as ADHD.
“(You) can have it list some prompts and bring those to your provider to communicate your symptoms a little better, so I do think there’s a role artificial intelligence can play, but if people start to rely on it, that’s a bad thing.”
Earlier this year, Dr. Christine Yu Moutier, chief medical officer of the American Foundation for Suicide Prevention, warned against using the technology for mental health advice, telling Fox News Digital that there is a “significant gap” regarding the intended and unintended impacts of AI on suicide risk, mental health and broader human behavior.
“The problem with these AI chatbots is that they are not designed with expertise in suicide risk and prevention. What’s more, there is no helpline available on these platforms for users who may be at risk of a mental health crisis or suicide.”
Dr. Moutier also explained that chatbots can fail to distinguish metaphorical from literal language, which means they may not be able to accurately determine whether someone is at risk of self-harm.
Nikolas Lanum from Fox News contributed to this report.