Health

3 Surprising Reasons Why AI Has Become Our Confidante, By A Psychologist

By Mark Travers, Contributor

Copyright Forbes


Our emotional ties to AI aren’t just fanciful. They follow patterns rooted in classic attachment theory, according to a new study.

From the early days of saying “Hey Siri” into our phones, to now asking ChatGPT, “Please help me structure my day,” we’ve come a long way. Within the last few years, artificial intelligence has undeniably become a dominant presence in our lives.

Many people are turning to AI chatbots for comfort, reassurance and even companionship, and these interactions can feel deeply personal and very real. New research offers an in-depth explanation on why human-AI relationships can become so strong.

In a May 2025 study published in Current Psychology, researchers Fan Yang and Atsushi Oshio investigated why people feel attached to AI. Interestingly, they found that our connection to machines mirrors the very attachment patterns we rely on in human relationships.

The functions that AI serves mirror the key roles that trusted humans play in our lives. But as comforting as this may feel, it also brings unique psychological risks.

Here are three key takeaways from the research on why we’re getting so attached to AI, along with actionable steps to use it mindfully.

1. It Can Feel Like A Safe Haven

When people say someone is their “safe haven,” they mean it’s someone they turn to for comfort and reassurance in times of distress and uncertainty.


A safe haven is a friend who listens without judgment when we’re upset or a partner who reassures us when we’re feeling lost. This is one of the most familiar and powerful ways we form emotional bonds. And AI has steadily entered the picture, ready to play these roles.

The Current Psychology study found that when people are stressed or overwhelmed, they may turn to AI for calm, steady reassurance. Unlike human partners, AI is unlikely to argue, criticize or withdraw from you. Its presence is consistent, and chatbots tend to offer non-judgmental responses, unless prompted otherwise.

After all, who doesn’t find it comforting to have something always available, always willing to listen when you need it? Think of someone venting to ChatGPT late at night after a breakup, or asking an AI for grounding exercises before a presentation. The interaction itself may not replace human comfort, but it provides a sense of relief: a pause button on spiraling thoughts.

That being said, using AI for emotional support still comes with caveats, and we must exercise caution. Here’s how to use it wisely when you’re tempted to reach out.

Be strategic. Briefly turn to AI for reflection, clarification or emotional support, but don’t let it replace human interaction entirely. AI may be helpful for short-term relief, structured advice or problem-solving, but it lacks genuine empathy and the nuanced, mutual emotional give-and-take that makes human relationships deeply fulfilling.

Balance it with human interaction. Even if you wish to use AI as a safe haven, remember to reconnect with friends, family or a therapist in real life. This ensures that your primary emotional needs are met through human connection and knowledge, as AI can also make errors and cannot be relied on for appropriate mental health advice.

Practice self-reflection. Journaling or recording your feelings after an AI interaction can help you separate genuine insights from comfort-seeking habits that might become excessive.

In short, you may want to use AI as a “first step” for emotional support, but it shouldn’t become the final destination for your social ties.

“Users should treat AI as a supplement rather than a substitute for human relationships, using it for practical or short-term emotional support while keeping real social connections and professional mental-health resources at the center of their support network,” emphasizes Yang, lead author of the Current Psychology study, in our recent interview.

2. It’s The ‘Secure Base’ We All Crave

Have you ever noticed how confident you feel when you’re secure in who you are and what you do? Many of us crave the freedom of taking calculated and creative risks, and focusing on our growth. This mindset becomes easier to maintain when your environment is conducive to your development. And that’s where, for many chatbot users, AI has become an ally.

In attachment theory, a “secure base” offers comfort and safety in hard times and encouragement in everyday life. Just as children with secure caregivers feel safe to explore their environment more freely, adults with secure partners feel safer taking risks and pushing themselves outside of their comfort zone. According to Yang’s study, AI is similarly beginning to function as an “attachment figure” and seems to provide a secure base, too.

“When there is no apparent threat or stress, the existence of attachment figures can encourage people to explore and seek their growth,” Yang notes.

People often turn to AI for brainstorming or experimenting with ideas that they might feel hesitant to explore with humans. Because AI tends to be non-judgmental and endlessly available, it can encourage risk-taking in a safe, controlled environment.

This might look like:

Using AI to draft creative work, test ideas or plan projects without fear of criticism.

Exploring personal insights through guided prompts or exercises suggested by the AI.

Practicing conversations or difficult scenarios with AI before facing them in real life.

The reassurance that the AI will respond supportively can embolden people to take steps they might otherwise avoid. Here’s how to use it as a booster without creating a dependence on it.

Apply your learning offline. Real growth comes when you translate these digital explorations into real-life actions, conversations or creative projects. Use AI as a rehearsal space, not the stage itself.

Stay self-aware. Notice if you start relying on AI as the only source of encouragement or exploration. While it can support growth, a secure base is most effective when combined with human feedback and experiences.

Using AI as a secure base may expand your creativity, self-reflection and personal development, but only if it complements, rather than replaces, real-world experiences.

3. It’s A Constant Companion, Through Thick And Thin

Due to their constant availability, AI chatbots can be highly compelling. A human friend might not answer your midnight text, but AI will. A partner might dismiss your tenth worry about a work email, but AI will likely respond patiently.

One of the primary points of difference between AI and human attachment figures, then, is that it never gets tired, distracted or so busy that it forgets you entirely. It’s always at your beck and call.

For people with high attachment anxiety, who tend to crave frequent reassurance, this availability can feel especially appealing. The constant proximity feels reassuring, especially for individuals who grapple with abandonment or rejection in human relationships.

This kind of accessibility, however, has a flip side. Over-reliance on AI for reassurance may reduce motivation to seek deeper, reciprocal human bonds. The danger is mistaking the comfort of a predictable script for the richness of a living relationship.

“You cannot exactly hug an AI the way you hug your friends, kiss your partner or play with your cat, even though physical touch is a vital part of how attachment forms. AI does not have its own life, and so it cannot share interesting things in its life with you. This makes the relationship between humans and AI inherently one-sided,” Yang explains.

Here’s what you can do to manage this double-edged sword:

Set clear boundaries. Schedule “AI-free” times or limit daily interactions to prevent over-dependence.

Diversify your support. Make sure your primary emotional needs are met by friends, family or mental-health professionals, rather than relying on AI for constant reassurance.

Reflect on your motives. Ask yourself, “Am I seeking insight, guidance or emotional comfort?” Understanding the difference can help you use AI effectively without creating dependency.

Another prominent finding of the study concerns two key attachment patterns in human-AI relationships: attachment anxiety and attachment avoidance.

Attachment anxiety shows up when you worry about whether the AI will respond warmly enough, leading to repeated check-ins or requests for reassurance. You may feel uneasy with neutral responses, even knowing the AI can’t truly abandon you.

Attachment avoidance is when you may prefer to keep AI at arm’s length, using it strictly for factual tasks and avoiding emotional sharing.

These tendencies can feel very real. So, acknowledge your attachment style as you navigate AI use. If you lean anxious, limit over-checking and diversify your sources of emotional support.

AI is becoming an emotional presence in our lives. Attachment theory helps us understand why these bonds matter so much to us, and why the key to healthy engagement with AI is balance and conscientiousness.

How attached do you feel to AI chatbots or social robots? Take the science-backed Attitudes Towards Social Robots Scale to find out.
