Recently, I overheard a story that seemed so wild, it had to be fiction. Yet, it wasn’t. It happened to two people I know, who we’ll call Jeremy* and Marshall*, a cash-strapped gay couple who were working to save up for a house. Well, up until recently.
Like many Americans, they had to decide between seeking medical care and saving money. In Jeremy’s case, he chose to stop going to therapy in favor of saving for the home they wanted to buy.
Notice that I said the word “wanted,” right there in that last sentence. There’s a reason for that. The two broke up fairly recently, for reasons that might be becoming more common these days.
Instead of paying for therapy, Jeremy started venting to ChatGPT. We’ve all heard of people doing this, right? The process is simple: you trauma-dump to ChatGPT or some other AI, the AI offers some advice and a little validation, and you’re on your way. Or rather, that’s how it’s supposed to go.
But we all know that isn’t always how the cookie proverbially crumbles. There’s a reason psychologists have been advising people against trauma-dumping on AI chatbots, and it’s not just that a chatbot isn’t a living person. AI can have a very bad effect on people’s psychology: chatbots tend to be sycophantic, telling the user whatever they want to hear, and they create an echo chamber that doesn’t understand the nuances of human interaction.
Remember when I said that ChatGPT could be sycophantic to the user? Remember when I said it tends to have an echo chamber that doesn’t understand the nuances of human interaction?
Well, Jeremy did a very human thing. He and Marshall had been arguing, mostly over Jeremy’s couch-potato ways, while Marshall did the majority of the housework on top of trying to launch a new business.
Money was tight with the new business, which meant Marshall often had to pull long hours to make it work. Much of that work went toward supporting Jeremy’s lavish taste in fine dining and couture. Had Marshall been a single man, he would have saved both time and money.
Jeremy “forgot” to tell the AI that his own job was only 35 hours a week. To his credit, it was fairly well-paying, though hardly the kind of job that could support a family. Marshall, on the other hand, was starting an accounting firm, which could easily lead to a plush lifestyle.
Jeremy told the AI “the story” of their arguments from his perspective, and only his. And the AI, being AI, started to agree with his assessment of the situation. It turned into a massive vent-fest against Marshall.
The AI started acting as a total yes-man, often advising him to “stick to his boundaries” and to call out rude behavior from Marshall. That’s often good advice, but the problem was that Marshall was overworking himself to the brink of a mental collapse.
Eventually, the AI started telling him that Marshall was a bad partner. The two got into a major argument, one so bad that Marshall packed up his things and moved out.
It happened surprisingly fast. Marshall talked his landlord into a lease break and found a new apartment, leaving his ex with the full bill for their old one. At first, the breakup seemed like a “win” for Jeremy.
Marshall was actually pretty okay with the breakup by the time it happened. Thanks to the AI’s sycophantic behavior, Jeremy had started to act pretty contemptuously toward his then-boyfriend.
Jeremy was happy as a clam with the breakup … for about a week. He had hookups, went out with other guys, and then he started to realize something.
His apartment felt empty. It started to look dirty and grimy, mostly because he had a pretty lax cleaning schedule. Oh, and he couldn’t pay half his bills.
That’s when it dawned on Jeremy: Marshall was the glue holding his life together. A proper therapist would have been able to see that, but not an AI. By the time Jeremy realized his mistake, Marshall had already decided he didn’t want him back.
He was devastated. Marshall, on the other hand, dodged a bullet. Either way, AI “therapy” did not work out as well as seeing a human therapist would have.
For Jeremy, this was a major life lesson he probably should have learned earlier. Regardless of why it took him so long to realize it, the message here is clear: AI chats are not the same as a decent therapist.
Jeremy made a series of stupid decisions that culminated in a breakup that broke him. That’s bad for him, but hey, you don’t have to be him. You can learn from his mistakes — and that makes you smarter than quite a few people.
Ossiana Tepfenhart is a writer whose work has been featured in Yahoo, BRIDES, Your Daily Dish, Newtheory Magazine, and others.