It’s scarily normal to see comments like this:

[Image: screenshot of one such comment]

On Instagram, TikTok, Threads (yes, somehow people actually use Threads), there’s a ton of talk about this. As expected from social media in 2025, not every take is well-informed or well-argued, but they’re undeniably interesting.

This will probably end up on a Fox News segment soon and turn into a widely debated topic, but it’s not there yet. That doesn’t mean it’s not a growing phenomenon all over the world.

Because of how we relate to tech, this happens more often - though not exclusively - among younger generations. Even OpenAI’s CEO acknowledged the issue:

"People rely on ChatGPT too much," Altman said. "There's young people who say things like, 'I can't make any decision in my life without telling ChatGPT everything that's going on. It knows me, it knows my friends. I'm gonna do whatever it says.' That feels really bad to me.”

There are two sides to this. I’ll start with the dark one, but heads up: I’m pretty pro-AI and won’t pretend I don’t find this whole thing kind of awesome, even if it’s a little insane.

The dark/horrible/terrible side:

Most people don’t understand how AI works.

And that’s okay — generative AI is new (yeah, yeah, AI’s been around for decades, but don’t start with me), it’s revolutionary, and it’s getting more accessible every day. You don’t need to understand it deeply to benefit from it. But this comes with traps, like:

AI doesn’t replace humans. Sure, it might do similar things, even take over some of our jobs, but it’ll never replace our biological, physical experience. It can emulate all it wants, but #neverwillbe.

Now, I’m no psychologist, but I’ve done a fair amount of therapy and I’m endlessly curious. Emotional support (real, human support) is central to therapy. It’s rooted in our ability to connect as humans. ChatGPT might try to convince you otherwise, but it’s just emulating that connection.

The good/cool/promising side:

From my perspective, people looking for support, advice, or even just relief — they’re chasing growth.

Any search for answers or self-knowledge — even through a so-called “intelligent” robot — beats psychological stagnation and endless internal battles.

Only 23% of Americans go to therapy. Even with public healthcare, online appointments, and more awareness around mental health, cultural and behavioral barriers still exist. There’s more talk than ever, but you’ll still hear “that’s for crazy people” or “I don’t need that.”

If someone using ChatGPT as a “therapist” (air quotes very much intended) is actually benefiting from it, is it that bad? Is it selfish? I don’t feel qualified to say for sure. But I do like to see the glass half full.

Just like ChatGPT doesn’t really offer emotional support, it also doesn’t judge. For a lot of people, that’s enough. It might feel way easier to open up to an emotionless machine than to a real human. Right or wrong, good or bad, it happens. People feel ashamed or uncomfortable being 100% honest with another person. AI helps bridge that gap.

My experience (and tips if you want yours):

AI is just a tool. Probably the most revolutionary, disruptive tool in a long time? Yeah. But still just a tool.

I’ve opened up to ChatGPT before, asked for advice, shared “secrets”. But I only recommend doing that if you understand that it will probably agree with you, that it hallucinates, and that nothing it says should be taken as truth. This cursed little robot is dangerously persuasive, and if you forget that, you’ll fall for it.

So go ahead, tell it stuff. Swap names and details if you’re paranoid about privacy. But remember: for now, it’s just a useful tool — one you should explore with curiosity, but also caution.
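If you do want to swap names and details before pasting anything in, you can even automate the scrubbing. This is just a sketch, not a real privacy tool: the name list and placeholder format are made up, and a simple find-and-replace like this won’t catch every identifying detail.

```python
import re

def scrub(text, names):
    """Replace real names with generic placeholders and mask emails
    before sharing the text with a chatbot. Whole words, case-insensitive."""
    for i, name in enumerate(names, start=1):
        text = re.sub(rf"\b{re.escape(name)}\b", f"Person{i}", text,
                      flags=re.IGNORECASE)
    # Mask anything that looks like an email address.
    text = re.sub(r"[\w.+-]+@[\w-]+\.\w+", "[email]", text)
    return text

message = "Ana told Bruno about my job offer. Email me at ana.silva@example.com."
print(scrub(message, ["Ana", "Bruno"]))
# Person1 told Person2 about my job offer. Email me at [email].
```

It won’t stop you from describing your life in recognizable detail, but it keeps the obvious identifiers out of the chat log.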

AI listens. But the one who needs to understand you is… you.

And sometimes, that is enough.