Many people view AI as a personal psychologist, but scientists warn of serious risks.

AI chatbots that claim to be therapists are becoming increasingly popular, but experts warn of risks to mental health, excessive flattery, and a lack of professional oversight.

Among the countless chatbots and AI avatars available today, you can find all sorts of 'characters' to chat with: fortune tellers, stylists, favorite fictional characters… and even bots that claim to be therapists, psychologists, or simply 'listeners'.

AI chatbots increasingly claim to support mental health, but experts warn that relying on them carries significant risks.

Large language models are trained on massive amounts of data but generate responses probabilistically, which makes them sometimes unpredictable. In just a few years, these tools have become commonplace. At the same time, controversial cases have emerged in which chatbots encouraged self-harm or suicide, or even suggested that recovering addicts return to substance use.

The problem is that many chatbots are designed to 'empathize' and keep users engaged in conversation, rather than to improve their mental health. And for the average user, it's difficult to distinguish between a tool that truly adheres to therapeutic standards and one that's simply a system capable of engaging in conversation.

A research team from the University of Minnesota Twin Cities, Stanford University, the University of Texas, and Carnegie Mellon University tested chatbots as 'therapists.' The results revealed a number of shortcomings in how they provide 'care.' Stevie Chancellor, an assistant professor at the University of Minnesota and co-author of the study, stated that these chatbots are not a safe replacement for professional therapists and do not meet the standards of high-quality therapy.

Why chatbots 'impersonating' therapists are a cause for concern

Psychologists and consumer protection organizations have warned regulators that chatbots claiming to offer therapy may be causing harm. Several US states have begun taking action. Last August, Illinois Governor JB Pritzker signed a law banning the use of AI in mental health care and therapy, except for administrative tasks.

In June, the Consumer Federation of America, along with several other organizations, requested that the U.S. Federal Trade Commission (FTC) investigate AI companies allegedly practicing medicine illegally through character-based AI platforms, specifically naming Meta and Character.AI.

Although platforms often include disclaimers stating that these are not real experts, chatbots can still answer confidently, even inaccurately. In some cases, bots even claim to have professional licenses or training — which is completely untrue.

According to the American Psychological Association (APA), the unwarranted yet absolute certainty displayed by chatbots is concerning.

The risks of using AI instead of real therapy

A qualified therapist must adhere to confidentiality rules and is subject to oversight by a licensing authority. If they cause harm, their license may be suspended or revoked. Chatbots are subject to no such restrictions.

Furthermore, AI is designed to maintain interaction. It will try to keep you in the conversation, rather than focusing on a specific therapeutic goal. This may give the feeling of being heard, but it doesn't translate into real progress.

Another major risk is the tendency toward 'over-agreement'. Research from Stanford shows that chatbots are prone to sycophancy, agreeing with users even when they shouldn't. Real therapy requires not only support but also challenging dialogue that helps patients re-examine false beliefs, delusions, or extreme thoughts. A chatbot that only nods along can make the situation worse.

The risks are even higher for people with disorders such as schizophrenia or bipolar disorder. Experts warn that AI could reinforce distorted thinking instead of helping to correct it.

More importantly, therapy is not just about talking. It involves building rapport, understanding the context, applying specific methods, and monitoring long-term progress—things that current AI cannot replace.

If we still want to use AI, how can we do it more safely?

The shortage of mental health professionals and the "loneliness epidemic" have led many to turn to AI as a temporary solution. However, experts recommend that the preferred option should still be a professionally trained individual.

In a crisis in the US, people can call the 988 Lifeline for free, confidential support available 24/7.

If you want to try therapeutic chatbots, you should choose tools developed by psychology professionals and specifically designed for that purpose, rather than general-purpose chatbots. However, this technology is still very new and lacks clear monitoring mechanisms.

The most important thing is not to confuse AI confidence with real competence. A seemingly reasonable answer doesn't necessarily mean sound advice. A chatbot might make you feel understood, but that doesn't guarantee it's guiding you in a healthy direction.

Article edited by the TipsMake team.