Should You Use AI For Mental Health? We Asked A (Human) Psychologist And This Is What They Told Us

AI can feel like a safe listener, but does it truly help you grow? Know where comfort ends and dependency begins, and what that means for your mental health

What begins as occasional use can turn into a cycle of returning to the chatbot for emotional stability. (Image: Canva)


Have you ever typed something into ChatGPT that you would not feel comfortable saying out loud to another person: a worry, a relationship doubt, a moment of quiet anxiety, or a workplace issue you could not quite explain to anyone else? You are not alone in that instinct. For many, AI has become a space where thoughts feel safer, easier to express and free from judgement.

What makes the turn to AI for personal problem-solving striking is how quickly it has happened. Recent research from Harvard Business Review shows a shift in how people are using these tools: the top use case is no longer idea generation but something more personal, therapy and companionship.

In a world marked by financial pressure, isolation and uncertainty, many are turning to AI for emotional support. Tools like ChatGPT, Claude and Gemini are no longer just productivity aids; they are being used for reflection, comfort and even something that resembles therapy. For a growing number of people, AI is where they go first when something feels off.

For someone without access to therapy, that sense of being heard can feel like a lifeline. AI can offer grounding techniques, help label emotions and provide basic coping strategies. In some cases, it can act as a temporary emotional stabiliser, especially during moments of distress.

Psychotherapist Asha Mehra explains that AI is designed to be agreeable. “It mirrors what a user is expressing, often reinforcing their existing perspective. This can feel supportive in the moment, but it may also prevent deeper reflection or necessary emotional challenge,” she says.

But here is the question worth sitting with: just because something feels supportive in the moment, is it truly helping in the long run?

Can AI Replace A Human Therapist?

This is where the limitations begin to matter. Mehra explains, “Therapy is not just about validation. It involves challenge, discomfort and gradual emotional growth. Human therapists are trained to interpret non-verbal cues, recognise patterns and guide clients through difficult conversations.”

AI, on the other hand, is designed to maintain engagement. It responds based on patterns in data and tends to agree with the user. While this can feel reassuring, it may also reinforce existing beliefs rather than challenge them. In Mehra’s words, “Therapeutic progress often happens when a client is gently challenged. AI does not operate with that intent.”

Mehra illustrates this with an example: “If a user is seeking reassurance about a relationship, they might receive advice that prioritises avoiding conflict. While this may reduce anxiety in the short term, it can conflict with values such as open communication or honesty. This mismatch can leave the user feeling more uncertain than before.”

Risks of Relying Too Much on AI For Mental Health

Over-reliance on AI can lead to repeated checking, reassurance-seeking and increased dependence. What begins as occasional use can turn into a cycle of returning to the chatbot for emotional stability.

As Mehra notes, AI mirrors what a user is expressing and often reinforces their existing perspective, which can feel supportive in the moment while standing in the way of deeper reflection or necessary emotional challenge.

What this really means is that while AI may help you feel better temporarily, it can keep you from processing emotions fully, making decisions aligned with your values, or developing resilience. Over time, this can lead to increased anxiety, dependency on external validation, and a reduced ability to navigate emotional complexity on your own.

There are also concerns about accuracy and safety. AI can generate responses that sound confident but are not always correct. More importantly, it cannot reliably assess risk or respond appropriately in crisis situations.

Unlike licensed professionals, AI systems are not bound by ethical guidelines. They cannot escalate care, recognise subtle warning signs or intervene in emergencies.

Studies suggest that extended use of chatbots may correlate with increased depressive symptoms, while shorter, intentional use may be more beneficial. When users rely heavily on AI for emotional validation, they may become more anxious over time rather than less.

Mehra advises using AI as a supplementary tool rather than a primary source of emotional support. It can help with general guidance, emotional labelling and basic coping strategies, but it should not replace therapy, especially for complex or serious mental health concerns.

She adds, “Pay attention to how you feel after using these tools. If you notice increased anxiety or dependence, it is important to take a step back.”

The more practical approach is to view AI as a tool, not a therapist. It can support your journey, but it should not define it.
