Your right hand hurts. You have a throbbing headache, difficulty swallowing, or possibly some weird rash. What should you do?
Google it, of course. What else?
For rashes, we go straight to images to compare and self-diagnose with the help of the most popular search engine. Sometimes this is fine for simple things, but Google can’t and shouldn’t substitute for a health care professional.
A similar concern arises as ever-more-powerful AI tools emerge: many people turn to them instead of to an expert in the field.
A recent study from Pew Research Center shows that 58% of adults under 30 have used ChatGPT, up from 43% in 2024 and 33% in 2023. Younger generations and people with higher education levels use it more often.
Now, a growing number of people are starting to use ChatGPT for therapeutic purposes.
According to a survey conducted by the nonprofit Sentio Marriage and Family Therapy program and the online therapy provider Sentio Counseling Center, nearly half of respondents who both use AI and self-report mental health difficulties are turning to large language models (LLMs) for therapeutic support, Sentio University writes.
Even the father of ChatGPT, OpenAI CEO Sam Altman, criticizes this practice.
Sam Altman sounds alarm about privacy issues when using ChatGPT
While there are no official statistics on how many people use ChatGPT or other LLMs as a substitute for, or supplement to, professional therapy, the number likely runs into the millions of Americans.
Sentio University bases this estimate on its survey and on data from the National Institute of Mental Health, which indicates that around 52 million Americans live with some form of mental health challenge.

Does this mean ChatGPT is replacing therapists? It is not, but the estimated number of people using it for therapeutic purposes is significant, and alarming.
Altman himself recently highlighted one of the reasons ChatGPT shouldn’t be used that way.
Altman appeared on Theo Von’s podcast “This Past Weekend w/ Theo Von.” Asked about the legal implications of AI, he said, “We will certainly need a legal or a policy framework for AI,” Quartz reported.
Altman noted a specific problem when people use chatbots as their therapists.
“People talk about the most personal s**t in their lives to ChatGPT,” Altman said. “People use it — young people especially use it — as a therapist, a life coach.”
Why does Altman suggest it is not okay to get personal with ChatGPT?
A major concern is privacy. “If you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it, there’s doctor-patient confidentiality, there’s legal confidentiality,” Altman said.
With ChatGPT, there is no such legal confidentiality, meaning that in a lawsuit, for example, OpenAI could be required to hand over the “most sensitive stuff.”
Can AI safely replace mental health providers?
Altman stressed the urgency of addressing the privacy issue and added that the policymakers he talked to agreed.
But privacy shouldn’t be your only concern when considering using AI as a therapist.
According to Aarhus University Psychiatry Professor Søren Dinesen Østergaard, people prone to psychosis might fuel their delusions when using chatbots for therapy.
Østergaard explains that “the correspondence with AI chatbots such as ChatGPT is so realistic that one easily gets the impression that there is a real person at the other end.”
Yet AI can’t provide the empathy and human connection that is crucial to various forms of therapy. And according to a study from Stanford University, AI therapy chatbots can make inappropriate statements.
Then why are so many people turning to AI for therapy instead of a professional? The reasons are simple: it is easy, free, and accessible 24/7.
Finding the right therapist, or simply waiting for an opening, is hard, especially for people who are already struggling. On top of that, the average online session costs about $65 to $95, while in-person sessions can run $100 and up, according to Healthline.
So while experts agree that AI could eventually play a role in therapy, it remains unclear how to make that safe.