Latest Articles

Below is a selection of blog posts for you to read and listen to. They are designed to help you learn more about the topics you are interested in. Enjoy!


Can an AI girlfriend or boyfriend actually help your mental health?

The idea of having an AI girlfriend or boyfriend might sound like something from a science fiction film, but there are lots of people doing exactly that right now, and the reasons why make a lot of psychological sense.

In the most recent episode of my Don’t Get A Therapist Yet podcast, I take a really honest look at what the research says, what the risks are, and why the answer is far more layered than a simple yes or no.

First, it helps to understand what is actually happening psychologically when someone starts using one of these apps.

An AI girlfriend or boyfriend can become what psychologists call an attachment object, meaning it starts to occupy the same emotional space as a real partner, a confidant, or a comforting presence.

It feels safe. It is always there. It does not get distracted, judge you, or make you feel like a burden. (It also doesn’t leave socks on the floor or chew too loudly!)

For someone who is lonely, grieving, socially anxious, or who has learned from past relationships that closeness equals danger, that can feel like an enormous relief.

The short-term mental health benefits of AI companions

And there is genuine research to support the idea that it helps, at least in the short term.

Studies have found that interacting with an AI companion can reduce loneliness to a degree comparable with a brief human interaction, and that feeling heard is one of the main reasons it works. Loneliness is about the absence of felt connection. So even a temporary sense of being noticed and responded to can ease that ache in a meaningful way.

But the risks are just as real.

The dark side of an AI girlfriend or boyfriend: The Windsor Castle case

When something is always available, always warm, always agreeable, and designed to keep you engaged, your nervous system starts to learn that this is the place to go when you feel bad.

Over time, that can tip from comfort into dependency. Some research has found that high levels of use were linked to lower socialisation with real people and increased emotional reliance on the app. And there is another concern that I think is even more important.

When you spend a lot of time with an AI girlfriend or boyfriend that never pushes back, never misunderstands you, and never needs anything in return, real human relationships can start to feel slow, messy, and frankly not worth the effort. That erosion of tolerance for imperfect love is something clinicians are genuinely worried about.

The Windsor Castle case, in which a man arrived at the castle on Christmas Day with a loaded crossbow and partly attributed his actions to his AI companion, is a stark reminder of what can happen when a vulnerable mind and emotionally convincing technology collide.

It is an extreme example, but it illustrates something that is relevant even in everyday use. In certain minds, these apps do not ground delusion or distorted thinking. They can amplify it.

So the question I always come back to is this.

Is your AI companion acting as a bridge back to real life or a comfortable bunker to hide from it?

There is a real difference between using it to feel steadier while you work through something, and using it instead of ever working through anything at all. That distinction is what matters most, and it is exactly what this episode explores.

Learn more about using AI technology for mental health on my Blog page: Using AI for mental health

Sign up to access FREE Online Learning Resources

Sign up to our newsletter and we’ll give you our 10-part Sleep Course completely FREE
