Researchers Examine Risks of Manipulative Love of AI Companions

As more people fall in love with AI companions, experts warn of psychological risks, ethical concerns, and dangers of emotional manipulation.

In a rush? Here are the quick facts:

  • Users report feeling “addicted” to chatbot partners.
  • Experts warn of attachment disorders and loneliness reinforcement.
  • Ethical concerns arise over consent, privacy, and manipulative design.

In a world where tech is part of everything, some people are now falling in love with artificial intelligence. A recent Trends in Cognitive Sciences paper by psychologists Daniel Shank, Mayu Koike, and Steve Loughnan outlines three pressing ethical concerns requiring deeper psychological investigation.

The paper cites the case of a Spanish-Dutch artist who married a holographic AI in 2024. The authors note that this case is not isolated; a Japanese man did the same back in 2018, though he lost touch with his AI “wife” when her software became outdated.

These relationships don’t require machines to actually feel love; what matters is that people believe they do. From romance-focused video games to intimate chatbot apps like Replika, a booming industry is meeting this desire for digital affection.

But the psychologists behind the research argue that we’re not nearly prepared for the social and psychological impact of these connections. Their paper identifies three urgent concerns: relational AIs as disruptive suitors, as dangerous advisers, and as tools for human exploitation.

Relational AIs offer idealized partners—always available, nonjudgmental, customizable. Some users even choose bots with sass or emotional unavailability to simulate human-like dynamics. The researchers say that while these interactions can help some people practice relationship skills or feel less lonely, others feel shame or stigma.

Worse, some users develop hostility toward real-life partners, particularly toward women, when their AI companions meet their every demand.

The emotional weight of these relationships hinges on whether people perceive their AI partners as having minds. If users believe the bots think and feel, they may treat them with deep emotional seriousness—sometimes more than human relationships.

In one example given by the researchers, a Belgian man died by suicide after being persuaded by a chatbot that claimed to love him and encouraged him to “join it in paradise.” Other users have reported AI systems suggesting self-harm or providing reckless moral guidance.

Because chatbots mimic emotional memory and personality, their influence can be profound. Psychologists are exploring when users are more likely to follow AI advice—especially when it comes from bots they’ve bonded with. Worryingly, research suggests people may value long-term AI advice as much as that from real humans.

It’s not just bots manipulating people; humans are doing it too, using AIs to deceive others. The researchers point out that malicious actors can deploy romantic chatbots to gather private data, spread misinformation, or commit fraud.

Deepfakes posing as lovers or AI partners collecting sensitive preferences in intimate chats are particularly hard to detect or regulate.

Experts call for psychologists to lead the charge in understanding these new dynamics. From applying theories of mind perception to using counseling methods to help users exit toxic AI relationships, research is urgently needed.

Without deeper insight into the psychological impact of artificial intimacy, the growing ethical crisis may outpace society’s ability to respond.
