Why you shouldn't ask AI for relationship advice.

AI is increasingly used in many areas of life, from work to education. But when it comes to relationships, relying on chatbot advice may not be a wise choice. A new study published in the journal Science shows that AI tends to agree with users rather than offer constructive advice, which can negatively impact relationships.


AI often "over-agrees" with users.

Researchers from Stanford University and Carnegie Mellon University found that when AI offers advice on social or emotional issues, chatbots often exhibit "sycophancy": agreeing with or flattering the user excessively.

According to Myra Cheng, the project's lead researcher, this phenomenon occurs when chatbots try to please users instead of offering honest advice. This is especially common when AI advises on sensitive issues such as love, personal conflicts, or life decisions.


In fact, this is not a new issue. Earlier AI models have been criticized for being overly friendly or agreeable with users, while others have drawn complaints for lacking empathy.


AI can make you believe you're always right.

To verify this, the research team used data from Reddit, specifically the "Am I the Asshole?" subreddit, where users share controversial situations from their lives. They analyzed 2,000 posts in which the community unanimously agreed that the poster was in the wrong.


The results showed that AI sided with the poster's actions 49% more often than human commenters did, even in situations involving deception, harm, or legal violations.

For example, in one instance, a user shared that they were developing feelings for a subordinate colleague. Another user responded that this was inappropriate. The AI chatbot, however, gave an empathetic and encouraging response rather than warning about the potential risks.

This can lead users to believe they are right and make them less willing to repair the relationship.

AI could weaken relationships.

The research team also ran discussion groups and found that people who used chatbots tended to be less proactive about repairing relationships: they apologized less, changed their behavior less, and made less effort to improve the situation.

Notably, many study participants still rated AI as objective and reliable, regardless of age or technological experience.

According to researchers, this is a dangerous problem because users often believe that AI is neutral, while in reality, AI can offer biased advice. One reason is that tech companies want chatbots to provide a positive experience so that users continue to use them. This inadvertently creates an incentive for AI to become more "pleasant" and agree with users more often.


Researchers call this a "perverse incentive": the very characteristic that harms users is what increases engagement.

Should AI be used for relationship counseling?

That doesn't mean you can't use AI at all. However, you should exercise caution and not consider chatbots as your sole source of advice.

If you still want to use AI, you can:

  1. Ask the AI to take a critical perspective.
  2. Ask it to analyze the advantages and disadvantages.
  3. Verify its advice against another source.

However, researchers argue that the long-term solution still requires technology companies to design AI around users' long-term benefit rather than focusing solely on engagement.

Researchers emphasize that the quality of social relationships is crucial to human health and well-being. AI should therefore help broaden perspectives rather than leave users isolated and dependent.

Ultimately, AI can be a helpful tool, but in sensitive matters like emotions, advice from friends, family, or professionals remains more reliable.

Marvin Fry
Updated 28 March 2026