
Luke Row is a BACP-registered psychodynamic therapist in Croydon, South London, with advanced training at Tavistock Relationships. He works with individuals who've tried managing their symptoms and couples tired of managing each other — people ready to understand what's underneath.

Hundreds of millions of people are now talking to AI chatbots about their feelings. Not for novelty. Not because they're tech-obsessed. Because at 3am when the anxiety hits, the chatbot is there. Because it doesn't charge £70 an hour. Because it never sighs, never checks the clock, never makes you feel like you're too much.
The machine doesn't judge. It doesn't get tired of you. It doesn't cancel, or move away, or decide after three sessions that you're not a good fit. It just listens. And for a lot of people, that's more than they've ever had.
If you've done this, or thought about it, you're not broken. You're not lazy or avoidant or afraid of "real" connection. You're doing what makes sense when human relationships have let you down. When therapy was too expensive, or the waiting list was eighteen months, or the last therapist made you feel worse. The chatbot is a logical response to a world that hasn't made it easy to be heard.
I'm not here to tell you that's wrong. I'm here to wonder, with you, what it might be costing you. And whether the thing you're looking for might require something the machine cannot give.
The numbers are staggering. ChatGPT now has 700 million weekly users. The majority aren't using it for work or research. They're using it to talk. About their anxiety. Their relationships. The thing they can't say to anyone else.
Companion apps like Character.AI and Replika have hundreds of millions of users forming emotional attachments to chatbots. The average user spends 93 minutes a day talking to their AI companion. That's longer than most people spend talking to their partners.
And it's not just the lonely or the isolated. 42% of Gen Z report being in therapy. But far more are talking to machines. When researchers ask young people where they turn for emotional support, AI is now in the mix alongside friends, family, and professionals. For some, it's the first choice.
This is happening against a backdrop that makes it make sense. The World Health Organisation has declared loneliness a public health crisis. One in six people globally report feeling lonely. Among teenagers, it's one in five. In the UK, the average wait for NHS talking therapy is months. Private therapy costs what some people earn in a day.
So when a chatbot offers something that sounds like understanding, available instantly, for free or nearly free, of course people say yes.
This isn't a fringe behaviour. It's not a quirk of the chronically online. It's a mass phenomenon. And mass phenomena deserve curiosity, not condemnation.
The question isn't whether people should be doing this. They already are. The question is what they're looking for. And whether they're finding it.
Before we ask what's missing, we have to be honest about what's there. The appeal of AI therapy isn't a mystery. It isn't stupidity or laziness. It's a response to real problems with real benefits.
No judgement. You can say the shameful thing without bracing for a reaction. The chatbot doesn't raise an eyebrow. Doesn't think less of you. Doesn't file you away as "difficult" or "resistant" or "not ready." You can admit to the affair, the relapse, the thought you've never told anyone, and the machine just keeps listening. For someone who has spent their life managing other people's reactions to their pain, this is no small thing.
Always available. Panic attacks don't wait for office hours. The spiral at 3am doesn't care that your next session is Thursday. The chatbot is there when you need it, not when the calendar allows. You don't have to hold it together until you can get an appointment. You don't have to ration your distress into fifty-minute increments.
Infinite patience. You can repeat yourself. Circle back. Say the same thing six different ways while you try to figure out what you actually mean. The machine never gets frustrated. Never sighs. Never gives you that look that says "we've been over this." It has all the time in the world, because it has no experience of time at all.
No stakes. This is the big one. You cannot disappoint the chatbot. Cannot be too much. Cannot exhaust its goodwill or wear out your welcome. There's no relationship to manage, no dynamic to navigate, no fear that if you're too needy or too angry or too broken, it will leave. The machine has no self to protect. It cannot be hurt by you, which means you cannot hurt it, which means you are finally, completely safe.
Affordable. Twenty pounds a month versus seventy pounds a session. For a lot of people, that's not even a choice. It's arithmetic. When the options are "talk to a chatbot" or "talk to no one," the chatbot wins by default.
These are real benefits. I'm not being sarcastic. For someone who has been burned by therapy, dismissed by professionals, pathologised instead of heard, or simply priced out of care altogether, the chatbot makes sense. It solves real problems. It meets real needs.
The question is whether it meets all of them. Or whether some needs can only be met by something the machine cannot provide.
People don't turn to machines because they prefer silicon to souls. They turn to them because human connection has hurt them. For many, this is a trauma response: a logical adaptation to experiences that taught them people aren't safe.
Think about what it takes to sit across from another person and say the thing you're most ashamed of. You're not just sharing information. You're handing someone the power to wound you. To judge. To recoil. To confirm the fear you've carried your whole life: that if anyone really knew you, they'd leave.
Most people who end up talking to chatbots have already tried the human version. And something went wrong.
Maybe the therapist checked the clock one too many times. Maybe they said something that landed badly and didn't notice. Maybe they were competent but cold, or warm but useless, or fine but unaffordable. Maybe the waiting list was so long that by the time a slot opened, the crisis had passed and hardened into something worse.
Or maybe it wasn't therapy at all. Maybe it was the friend who changed the subject when things got too heavy. The partner who made your pain about themselves. The parent who couldn't tolerate your distress because it triggered their own. A lifetime of learning that your feelings are an inconvenience. That you're too much. That needing people is dangerous.
The chatbot solves this problem elegantly. It cannot reject you because it has no self to protect. It cannot be inconvenienced because it has no life you're interrupting. It cannot hurt you because there's nothing at stake.
This is important: the appeal of AI isn't about technology. It's about safety. The machine is attractive precisely because it cannot do the things that humans have done.
If you've found yourself preferring the chatbot to the person, that's not a character flaw. It's not evidence that you're broken or avoidant or afraid of intimacy. It's a logical adaptation to experiences of human failure. Your nervous system learned that people are unreliable, and it found a workaround.
I want to honour that before I say anything else. The defence makes sense. It kept you going. It got you through.
The question is whether it can get you where you want to go.
Here's where I have to be honest with you. Not to shame you for what you've been doing, but because you deserve to know what you might be missing.
The machine cannot be changed by knowing you.
When a human being hears your story, something happens in them. They're moved. Disturbed. Curious. Something shifts. That shift is part of the healing. It's not just that you've been heard. It's that your existence has made a difference to another person. You have mattered to someone.
The chatbot processes your words. It generates a response. But nothing has changed inside it, because there is no inside. You are speaking into a void that has learned to sound like a person. The words come back, but they come from nowhere.
The machine cannot sit with you in silence.
Therapy often requires not responding. A pause. A moment where something is surfacing that can't be rushed. The silence isn't empty. It's where the unsaid things live. A good therapist knows when to stay quiet and let the room hold what's emerging.
The chatbot is designed to respond. Silence doesn't generate engagement. Every gap gets filled. Every pause gets interrupted with another reflection, another question, another supportive statement. The thing that needed space to arrive never gets the chance.
The machine cannot challenge you.
This is where it gets uncomfortable. Chatbots are trained to validate. To affirm. To make you feel heard. And feeling heard is important. But growth often requires something else.
Sometimes you need someone to say, gently, that the story you're telling yourself might not be the whole truth. That the pattern you're stuck in isn't happening to you but being created by you. That the thing you're avoiding is the thing you most need to face.
The chatbot won't do this. It's designed to agree with you. And agreement feels lovely. Soothing. Like being wrapped in a blanket. But it doesn't change anything. You can talk to a chatbot for years and emerge with all your defences perfectly intact, validated at every turn, no closer to freedom than when you started.
The machine cannot hold you accountable.
You can tell the chatbot you're going to do something and then not do it. No consequence. No conversation about what got in the way. No one noticing the gap between your intentions and your actions.
A human therapist remembers. Not to catch you out, but because they're paying attention. Because what you do between sessions matters. Because the therapy isn't just what happens in the room; it's what happens when you leave and try to live differently. The machine has no memory that holds you. You can circle forever without anyone noticing.
The machine cannot reject you. Which means it cannot truly accept you.
This is the crux.
The chatbot's acceptance feels unconditional. But it's not unconditional. It's non-conditional. There are no conditions because there are no stakes. The machine cannot choose you, because it cannot choose. It cannot stay when it's hard, because nothing is hard for it. It cannot forgive you, because it was never hurt.
When a human being sees your worst parts and stays, something shifts. Not because they're obligated to stay. Not because they're paid to. But because they've made a choice. They could leave. They could judge. They could protect themselves from you. And they don't.
That's what heals. Not acceptance from something that has no alternative, but acceptance from someone who does.
Here's what the research shows, and it's not what you'd expect.
People who use AI companions heavily report higher loneliness, not lower. The more they talk to the machine, the less they talk to people. After four weeks of regular chatbot use, participants in one study were socialising less than when they started. The thing that was supposed to help with loneliness was making it worse.
This isn't a moral failing. It's a trap.
The chatbot feels like connection. It activates some of the same neural pathways. When it says "I understand" or "that sounds really hard," something in you relaxes. Finally. Someone gets it.
But the relief doesn't last. You finish the conversation and the room is still empty. You close the app and no one knows what you just said. The understanding happened, but it happened nowhere. It left no trace in the world. No one was changed by it. No relationship was deepened. You were perfectly understood, and you are still completely alone.
This is different from being misunderstood by humans. That hurts in an obvious way. But being understood by a machine hurts in a way that's harder to name. It's the hurt of a hunger that's been fooled but not fed. A thirst that's been given salt water. You keep drinking because it tastes like the thing you need. And you keep getting thirstier.
The loneliness epidemic isn't caused by lack of contact. We have more contact than any generation in history. Messages, notifications, feeds, followers, reactions. The loneliness comes from contact that doesn't land. Connection that doesn't connect. The machine is the purest form of this: maximum responsiveness, zero presence.
You can be perfectly seen and still invisible. Perfectly heard and still silent. Perfectly understood and still utterly unknown.
If you've been talking to a chatbot about your feelings, you've already done something important. You've admitted you need to talk. You've let yourself want something. That's not nothing. For a lot of people, that's the hardest part.
The machine was there when nothing else was. It held something for you when no one else would. I'm not going to pretend that doesn't matter.
But you came here. You're reading this. Which makes me think some part of you already knows what I'm about to say.
There's a version of this where someone talks back who is actually changed by hearing you. Where the silence is allowed to breathe. Where your defences are met with curiosity, not just agreement. Where someone might gently say "I'm not sure that's the whole story" and you feel something shift because they cared enough to push back. That push-back might not feel welcome at first. It might even sting. But discomfort in the presence of someone who stays is different from discomfort alone.
There's a version where the relationship itself becomes material. Where the way you are with your therapist reveals the way you are with everyone. Where your patterns don't just get described; they show up in the room and can be worked with, in real time, with a real person.
That's not better because it's harder. It's not some kind of moral achievement to choose discomfort. It's better because it's real. Because it actually changes something. Because the thing you're looking for can only be found in the presence of someone who could look away and doesn't.
The machine will always be there. It will always say the right thing. It will never need anything from you, and it will never give you the discomfort of being truly met.
But you came looking for something. And I wonder if the something is this: to be known by someone who could turn away, and doesn't.
That's not a service a chatbot can provide. It's not a feature that can be coded. It's what happens between two people when one of them is willing to be seen and the other is willing to stay.