Strategies & Market Trends : Technical analysis for shorts & longs

To: Johnny Canuck who wrote (64586), 7/17/2025 7:26:29 PM
From: Johnny Canuck, Read Replies (1) of 68566
 
cnbc.com

Health and Wellness
AI can’t be your therapist: ‘These bots basically tell people exactly what they want to hear,’ psychologist says
Published Thu, Jul 17 2025 5:00 PM EDT


Gili Malinsky





Increasingly, people are turning to AI chatbots like Character.ai, Nomi and Replika for friendship and mental health support. And teenagers in particular are leaning into this tech.

A majority of teenagers, 72% of those ages 13 to 17, have used an AI companion at least once, according to a new report by media and tech ratings nonprofit Common Sense Media. Survey respondents said they use AI for conversation and social practice (18%), emotional or mental health support (12%) and as a friend or best friend (9%).

AI can be a powerful tool, but it's no substitute for genuine human interaction, whether personal or professional, like seeing a therapist, psychologist and researcher Vaile Wright said on a recent episode of the American Psychological Association's "Speaking of Psychology" podcast.

“It’s never going to replace human connection,” she said. “That’s just not what it’s good at.” Here’s why.

Chatbots were ‘built to keep you on the platform for as long as possible’
AI chatbots were not built to provide fulfilling, long-term interactions, experts say.

“AI cannot introduce you to their network,” Omri Gillath, professor of psychology at the University of Kansas, told CNBC Make It back in May. It can’t introduce you to new friends or significant others and it can’t give you a hug when you need one.

Instead, chatbots were “built to keep you on the platform for as long as possible because that’s how they make their money,” Wright said of the companies that create them. They do that “on the backend by coding these chatbots to be addictive.”

Ultimately, a relationship with a chatbot feels “fake” and “empty” when compared to a relationship with a human, Gillath said.

Therapy and companionship are the top reasons people turn to generative AI and chatbots, according to Harvard Business Review reporting. But experts warn that AI cannot — and should not — be your therapist.

"These bots basically tell people exactly what they want to hear," Wright said. "So if you are a person that, in that particular moment, is struggling and is typing in potentially harmful or unhealthy behaviors and thoughts, these types of chatbots are built to reinforce those harmful thoughts and behaviors."

Another major weakness of this tech is that AI has knowledge, but not understanding.

"An AI chatbot unfortunately knows that some legal drug use makes people feel better," Wright said. "It gives you a high, and if somebody is saying, 'I'm low and depressed,' that might be the advice it gives. But it doesn't understand that you don't give that advice to people in recovery from illegal drug use."

That difference between knowing and understanding “is actually really critical when we’re talking about the use of these for therapy.”