If you type “ChatGPT therapist” into Google, the first result you are likely to see is a Reddit thread boldly titled “ChatGPT therapy saved me”. As you scroll through the hundreds of comments left beneath it, you’ll find countless individuals sharing deeply personal stories, accounts of how speaking to an AI chatbot has salvaged crumbling marriages, soothed long-standing emotional wounds, and helped them navigate complex relationships or mental health struggles that they may not have had the language or ability to express elsewhere.
Two things are immediately striking about these testimonials. Firstly, people are willingly and openly discussing their use of ChatGPT as a safe space on the internet, for everyone to see, and yet they appear unable to speak to their loved ones about the struggles in their lives. Secondly, there is an overwhelming sense that AI offers a rare and precious experience – being heard without being judged, being understood without facing the discomfort of human reaction, and being able to speak freely without fear of shame, rejection, or dismissal from another person.
This raises a pressing and rather unsettling question: if so many people are able to open up to a machine, an algorithm with no real consciousness or capacity for empathy, what does that reveal about the state of our emotional lives and our ability to connect meaningfully with each other?
Just within this past week, friends of mine have reported overhearing several men on the tube openly seeking relationship advice, not from their partners, friends, or even therapists, but from ChatGPT. One was trying to work through the emotional confusion of missing his ex-girlfriend; another wanted guidance on how to rescue a relationship that was beginning to fall apart. These were not abstract or theoretical questions; they were deeply human struggles being brought to a digital assistant, not because it’s perfect or even particularly insightful, but because it’s perceived as ‘safe’.
At the same time, as I scroll through social media, I’m constantly confronted with a strange and somewhat paradoxical form of overexposure: a public kind of vulnerability that seems to thrive in comment sections, particularly beneath posts about celebrities and viral relationship content. During the latest wave of online hate surrounding Justin Bieber, named as ‘Hailey Bieber’s biggest hater’, users responded not just with hot takes or jokes, but with startlingly raw admissions about their own emotional pain, failed relationships, and histories of emotional neglect or abuse. One comment on a seemingly innocuous video of a smiling young couple read, “Not me with my narcissistic husband,” and was met with hundreds of replies, likes, and shared sentiments, creating what felt almost like a support group hidden in plain sight.
These two trends, people quietly turning to AI for emotional support, and strangers loudly sharing intimate pain in public digital spaces, both suggest that something fundamental has shifted in how we relate to each other and to ourselves. We are simultaneously more exposed and more emotionally isolated than ever before, and while we may have endless platforms to express ourselves, the depth and quality of real human connection seem to be slipping through the cracks.
It’s worth asking, then: is this growing reliance on AI for emotional processing simply a product of convenience – of a tool that’s free, always available, and easy to access (particularly in the current state of access to mental health services) – or is it symptomatic of a deeper cultural problem, one that speaks to our dwindling capacity for emotional intimacy, and our collective discomfort with vulnerability in close, interpersonal relationships?
In exploring this idea further, I found myself returning to a phrase that has gained increasing attention in recent years: the male loneliness epidemic. Across various studies and surveys, a consistent pattern has emerged – men are reporting fewer close friendships, less emotional support, and greater feelings of isolation than their female counterparts. This isn’t just a modern inconvenience; it’s a public health concern with real consequences, from increased mental health struggles to rising rates of depression and suicide. And yet, the cultural stigma around emotional openness continues to linger, pushing many men further into silence, until even their most private fears and desires are directed toward something incapable of emotional reciprocity: a chatbot.
What’s happening here may be more than just a tech trend; it might be the emergence of a new kind of digital journaling, one where instead of quietly pouring our hearts into notebooks, we type our pain into chat windows, hoping for understanding in return. Unlike a traditional diary, AI responds; it mirrors us, offers structure, and sometimes even gives comfort, even if its empathy is only simulated. It’s not therapy, it’s not regulated, and it’s certainly not human, but it’s accessible, fast, and it doesn’t ask anything of us emotionally.
This raises a profound and sobering question: are we truly seeking healing, or are we simply looking for somewhere – anywhere – to put our pain, because we’ve forgotten how to bring it to one another?
In the end, the rise of AI as an emotional confidante doesn’t just reveal something about technology – it reveals something about us. About how far we’ve drifted from one another, about how hesitant we’ve become to reach out in real life, and about the quiet ache of a generation that feels more comfortable whispering their truths into the void than speaking them aloud to someone they love.