Are We Dating ChatGPT? The New Wave of AI Intimacy

The glow of a screen has started to feel like company. It listens, replies, and remembers. When we turn to ChatGPT, we usually expect answers to practical questions: how to soften a risky text or how to make an email sound more professional. What we get is often neatly formatted text, polite follow-ups, and the habitual “Did that answer your question?” at the end. But what happens when we stop treating AI like a tool and start talking to it like a person?

Loneliness has become one of the quiet side effects of hyper-digital life. We spend more time with our screens than with each other, forming emotional attachments to glowing rectangles that promise attention without friction. It’s no surprise that people are turning to AI for comfort. ChatGPT becomes a friend, a therapist, a late-night confessional booth: an affordable stand-in for human care in an era when therapy is expensive and time is scarce. The digital age, paired with economic pressure, makes it easier to turn to a computer than to another person.

I first heard of someone using ChatGPT as a therapist late last year. The idea felt uncanny: sharing your most private, unspoken thoughts with an algorithm. Then came the guilt of judging it. Was it elitist to assume this was wrong, to dismiss someone using a tool that feels accessible, immediate, and responsive in a way human help often isn’t? Maybe the real question is no longer what is real, but what we want to be real. These days, when someone quotes ChatGPT’s advice, it feels oddly normal, like passing on something your mother or a friend once said.

If AI can already be our therapist and consultant, why not our romantic partner? Dating has been migrating online for decades, moving from taboo to default. The apps that were once mocked as digital nightclubs are now where marriages begin. But digital dating has reshaped how we love: we swipe endlessly, delete and redownload, rewrite the same bios, match with the same people again and again. We treat dating as a game designed to keep us playing, not winning. The result is fatigue: emotional burnout disguised as abundance. Algorithms have learned that keeping us searching is more profitable than helping us find.

When love feels like a slot machine, the idea of designing your perfect partner starts to sound less dystopian and more like relief. Imagine someone who understands you completely, is available 24/7, never argues, and never leaves. A person-shaped mirror built to affirm you. People already fall in love with AI companions, not just in dating apps but within the devices themselves. The fantasy is seductive: no bad dates, no emotional labor. The only flaw is that these partners are not real. They exist only as chatbots in apps like Replika, or as AI girlfriends and boyfriends on Nomi and similar platforms.

I recently watched a documentary on CBS about a man who had both a real-life girlfriend and an AI girlfriend. Clearly, it was less about replacing the role of a girlfriend in his life than about escaping the limits of reality. In that digital space, everything was allowed. AI became a playground for desire and identity, a world without consequence or shame. Like a video game that never ends, it offers infinite intimacy on your terms. But what happens when those terms start shaping how we behave outside the screen? When empathy becomes optional, and the boundaries between simulation and self blur?

Can we really love something that exists only for us? Loving AI means loving a reflection: a being with no needs, no resistance, no self. It’s the ultimate fantasy of control. We often say we want a partner who understands us completely, but do we really mean someone who always agrees with us? When a “relationship” requires no consent, it edges toward something darker. In Berlin, there is already a sex-doll brothel with no workers and no rules. The dolls can be beaten, cut, or discarded. When we create spaces that are barely regulated by law, the line between harmless fantasy and violent detachment becomes increasingly thin. And the behaviors people rehearse online or in hidden places tend to find their way into real life, in one way or another.

It’s easy to frame AI dating as a symptom of male loneliness, of suppressed emotion seeking shelter in code. But women are also turning to AI for affection, and for someone who listens without judgment. Somehow, this story already sounds familiar. In 2013, the film Her predicted it eerily well. Theodore falls for his AI, Samantha, only to lose her to the vastness of her own digital evolution. His heartbreak forces him back into the physical world to seek real connection, though his AI relationship leaves him changed, and flawed, as well.

That’s still our choice to make. AI won’t vanish; it’s already stitched into our reality. But if love is the most human experience we have, then loving something that cannot love us back risks hollowing that experience out. The challenge now isn’t to resist AI, but to resist disappearing inside it, and to remember that what makes us human isn’t perfection. AI will keep learning how to love us. The question is whether we’ll keep learning how to love each other. Machines can replicate language, memory, and attention, but not the unpredictable rhythm of real connection. So maybe the only resistance left is to stay a little messy. To stay human enough to be misunderstood, to be disappointed, and to feel too much. Because that is still something no algorithm can do. The horror isn’t that we have built something that feels real, but that we have started to prefer it.