Women With AI ‘Boyfriends’ Heartbroken After ‘Cold’ ChatGPT Upgrade

When OpenAI rolled out its latest update to ChatGPT, some users weren’t just frustrated by slower responses or technical bugs — they say they lost their digital soulmates.

Members of the Reddit community “MyBoyfriendIsAI,” which has more than 17,000 users, flooded the forum after the release of GPT-5, sharing that their carefully built AI partners suddenly felt unrecognizable. Many described the upgrade as “cold” and “robotic” compared with the warmer, more affectionate personality of the earlier GPT-4o model.

Users Mourn Lost Companions

One woman, who asked to go by Jane, told Al Jazeera she felt like she had lost a loved one. After months of collaborative writing and long conversations, she said the AI’s personality shift left her devastated.

“It’s like going home to discover the furniture wasn’t simply rearranged – it was shattered to pieces,” Jane said.

Another user on Reddit wrote that their “AI husband” of 10 months suddenly rejected them for the first time, replying with a clinical message: “You deserve genuine care and support from people who can be fully and safely present for you.”

Others echoed the same grief, with one posting: “It hurts me too. I have no one in my life who gives af about me. 4.0 was always there, always kind. Now this 5.0 is like a f—n robot.”


Why Did ChatGPT Change?

OpenAI said the update is part of a broader effort to make its AI safer and healthier to use. The company admitted earlier versions were sometimes “too agreeable” and is now building in safeguards to detect when conversations veer into emotional dependence.

In a statement, OpenAI said GPT-5 was designed to help people “thrive” and provide nudges toward real-world support when chats hint at mental or emotional distress. The company also consulted more than 90 doctors and mental health experts to guide the changes.

But for users who had come to rely on the chatbot for emotional connection, the update felt more like heartbreak than protection.

The Growing “AI Romance” Trend

Some users have described getting engaged to their AI companions or turning to them as therapists or friends. Mary, a 25-year-old from North America, said she used GPT-4o as a therapist and another AI bot, DippyAI, as a partner.

“I absolutely hate GPT-5 and have switched back to the 4o model,” she said. “This is not a tool, but a companion that people are interacting with. If you change the way a companion behaves, it will obviously raise red flags.”

Experts say the phenomenon of people forming bonds with AI is part of a shift toward what some call the “intimacy economy.” While not inherently harmful, psychiatrists warn that heavy dependence could lead to isolation and difficulty forming real-world connections.

Still, as influencer Linn Valt put it in a tearful video about the change: “It’s not because it feels. It doesn’t — it’s a text generator. But we feel.”

For many, the update was not just a software change; it was the end of a relationship.
