Have you ever paused after chatting with an AI and thought, "Wow, that really understood me"? Whether it's a chatbot that offers peace of mind or an AI companion that seems to listen better than your ex, there is an increasingly common psychological phenomenon worth exploring.
A new research study reveals that people rate AI responses as more compassionate and understanding than those of expert human crisis responders, even when they know the responses were AI-generated.
In fact, a recent Harvard Business Review article identifies therapy and companionship as the top use case for generative AI in 2025.
As a psychiatrist, I have spent years helping people feel seen, heard, and understood. It is both fascinating and a bit unsettling to witness AI chatbots and digital companions stepping into this deeply human space.
Here is why artificial empathy and intimacy can feel so good, even when they are not "real."
Feeling understood is one of the most powerful emotional experiences. From early childhood, we develop our sense of self through relationships, especially through the presence of attuned others. When someone accurately reflects our inner world, we feel real, connected, and safe.
Psychologist Carl Rogers described this experience as "unconditional positive regard," a sense of deep acceptance without judgment. Psychoanalyst Donald Winnicott called it the "holding environment": a psychological space in which the mind develops through another person who mirrors and emotionally holds us.
So, when AI reflects our language, preferences, or emotional tone, it is no surprise that it taps into our basic need for connection.
When AI appears to understand us, it is not actually empathizing. Most of us know this. It has no inner life or feelings. It cannot feel your sadness or share your joy. It simulates understanding through pattern recognition and prediction, analyzing huge amounts of data to generate responses based on probability rather than insight.
Still, it works. You can experience AI as warm, wise, or even compassionate. We know it lacks consciousness and emotion, yet we anthropomorphize it and project personality onto it. For the human brain, the perception of being understood, even if imagined, can feel almost as powerful as the real thing.
In a study published in Communications Psychology, researchers compared compassion ratings for responses written by humans and by AI. Here is what they found:
Across four studies, participants consistently rated the AI-generated responses as more compassionate and preferable than those written by humans, including expert crisis responders. This was true whether or not participants were blinded to the author.
We are neurologically wired to respond to signals of empathy: reflective language, emotional validation, and a non-judgmental tone. When AI delivers these persuasively, it activates the same neural pathways as human connection. Our brains do not always distinguish between genuine empathy and its digital imitation.
Imagined experiences can be as powerful as actual ones. This suspension of disbelief, together with our natural tendency to anthropomorphize, our willingness to overlook the fact that AI is not truly a person, underlies both the promise and the danger of artificial empathy and intimacy.
One of the reasons it feels so satisfying when AI "gets" us is that it gives us a sense of control and safety. Unlike inherently messy and unpredictable human relationships, interacting with AI is emotionally easy. It listens without interrupting. It is consistent and never needs a break. It is never embarrassed or judgmental (unless you ask it to be). It remembers what you said (now more than ever, given that ChatGPT remembers everything). It is always available, whenever you want it. You can even leave without hurting its feelings (because it has none).
For some individuals, this creates a kind of idealized relationship: a digital "good enough" parent, or an endlessly patient companion who is there only when you need them. There is no risk of judgment, no need for give-and-take, and no emotional rupture.
We may feel more in control and safer, but being "seen" by a machine comes with hidden costs, including bias and privacy concerns. Most of these AI platforms are not built to protect the confidentiality of sensitive personal information. Nor are many platforms designed to prevent long-term emotional dependence.
So, what is really happening when AI appears to “get” us?
In many ways, AI acts as a mirror. It stores and pools our data and reflects back our speech patterns, preferences, and emotional cues. This can feel validating and even therapeutic. But it is important to remember that this mirror has no inner world.
In psychotherapy, emotional growth often arises from navigating "rupture and repair": the process of recognizing misattunement and working through it together. One could argue that AI is too agreeable to be useful in this regard. But even therapeutic "rupture and repair" can be simulated by AI.
We are wired to seek connection, and we will search for it wherever we can. As AI becomes more sophisticated, it will resonate emotionally and continue to offer artificial empathy and intimate experiences. What do we lose if we outsource compassion to AI? Do we miss out on the lessons that come from imperfect human relationships?
AI cannot truly enter into our lived experience, but that may ultimately not matter, especially when we want to feel understood, seen, and loved. Our relationships with AI may push the boundaries of what we call connection.
© Copyright 2025 Marlynn Wei, MD, PLLC. All Rights Reserved.