
The Quiet Shift in How We Handle Relationships
In recent years, there has been a subtle but significant shift in how people approach their relationships. Most of us haven’t noticed it happening, but the way we seek emotional clarity has changed. Instead of turning to the person sitting across from us, many of us now turn to our phones. This isn’t because we no longer love our partners, but because a chat window offers something that human interaction often lacks: neutrality. It doesn’t sigh, doesn’t get defensive, and never makes us feel guilty for bringing up an issue at the wrong time.
A 2025 survey of 1,000 married Americans found that 64% of respondents turn to AI tools for relationship advice before turning to their spouse. At first glance this may seem unusual, but it’s becoming increasingly common. Think about the last time you typed something into ChatGPT that you probably should have said out loud. The behavior is widespread enough to deserve a closer look, not because using AI is inherently wrong, but because relying on it reflexively and repeatedly signals something deeper.
The Comfort of a Witness Who Can’t Hurt You
The appeal of AI is clear once you consider what it offers. It’s fast, available 24/7, and free in a way therapy never is. More importantly, it provides a safe space to test your thoughts without consequences. You can describe your partner’s behavior in the harshest terms, work through your anger, and craft your argument with zero risk of escalation. For people who grew up in environments where conflict felt dangerous, that kind of safety is invaluable.
However, the problem arises when what feels like preparation becomes a substitute for real connection. Intimacy is built on reciprocity—the act of being vulnerable and then offering support in return. When we process conflicts through a chatbot first, we arrive at the actual conversation already resolved, already certain, and somewhat detached from the messy human being on the other side. We’ve rehearsed a verdict, not a dialogue.
MIT social scientist Sherry Turkle predicted this structural issue over a decade ago. In her book Alone Together, she described what she called the Goldilocks Effect: our tendency to create relationships that are not too close and not too far, mediated through technology to avoid the emotional risks of real contact. Asking an AI to navigate our most intimate conflicts is the ultimate realization of that dynamic.
When Avoidance Gets a Productivity Makeover
One reason this habit is hard to examine is that it disguises itself as self-improvement. Asking ChatGPT how to phrase a difficult conversation, or whether your feelings are reasonable, sounds like emotional maturity. One study found that most people reported more benefit than risk from using ChatGPT for mental health concerns, noting that it is particularly effective at helping people word difficult messages. That much is true: AI can be helpful. The danger comes when clarity-seeking becomes a routine way of avoiding the discomfort of being uncertain in front of someone else.
The same survey found that 33% of married respondents felt AI tools understood their relationship struggles better than their spouse did. An algorithm doesn’t truly understand anything, but it also doesn’t interrupt you, challenge your framing, or bring its own emotional needs to the table. When that feels preferable to talking to your actual partner, the issue isn’t the technology; it’s that the relationship has developed an entrenched avoidance pattern, and the AI is making it easier to maintain.
The same survey also found that 28% of respondents had made a financial decision based on AI advice without telling their spouse. Financial decisions made alone, conflicts processed alone, emotional labor outsourced to a machine that won’t remember the conversation tomorrow. Each of these choices seems minor on its own, but together they paint a picture of a relationship where two people are increasingly solving the problem of each other rather than solving problems together.
What We Lose When We Optimize for Ease
Men make up roughly 85% of ChatGPT’s user base and are nearly three times more likely than women to use it for relationship advice. This pattern aligns with data showing that men are significantly less likely to seek therapy or discuss emotional struggles with friends. For a generation that was never taught that vulnerability is a strength, an AI that listens without judgment can feel like a miracle. What it actually is, however, is practice for a conversation that still has to happen with a real person who might push back.
Among Gen Z adults, 41% report having used AI to navigate their romantic lives. That number makes sense for a generation that has always had the option to edit, delay, or sidestep difficult conversations rather than have them in real time. But a text version of a vulnerable thing is almost always a smaller version of that thing, and real intimacy requires patience, vulnerability, and a willingness to be inconvenienced. Optimizing that friction out of existence doesn’t make relationships easier; it makes them shallower.
Using a tool to get clearer on what you feel before a hard talk is reasonable. The red flag is when the chatbot becomes the relationship’s de facto therapist and confidant, when the AI knows more about what’s going wrong than your partner does. At that point, you’re not using technology to get closer to your person. You’re using it to maintain a comfortable distance, and calling it communication.