AI-Powered Mental Health: A Therapist’s Perspective on the Promise and Pitfalls
- Piper Harris, APC NCC
- Apr 21
- 8 min read

In recent months, the mental health field has seen a sharp rise in the promotion and integration of AI-powered tools. Tech startups and established platforms alike are investing in digital mental health assistants, chatbot therapists, and emotionally intelligent algorithms that promise immediate access to support—any time, anywhere. Apps like Woebot and Wysa have gained traction by offering mental health check-ins, daily affirmations, and guided reflections, all driven by artificial intelligence. Some companies even claim their bots can deliver cognitive behavioral therapy interventions.
At first glance, it sounds like a breakthrough. Affordable, 24/7 mental health support delivered straight to your phone? For many struggling to access therapy due to long waitlists, stigma, or high costs, it might seem like a godsend.
But as a mental health professional, I’ve learned to examine trends through a sharper lens.
I approached this topic with both skepticism and curiosity. I’ll admit it: I had my reservations. I’ve spent hundreds of hours doing this work face-to-face with clients, helping them unravel deeply rooted beliefs and behaviors. Could a chatbot truly replicate the complex, layered, and deeply personal process of therapy? Or was my reluctance simply bias—an instinctive defense of the human element in counseling?
To challenge my own assumptions, I decided to test it for myself. I engaged directly with ChatGPT, not just as a user, but as a clinician assessing how this kind of technology might function as a mental health aid. I asked questions. I requested support. I gave specific instructions—some therapeutic in nature, others clinical in tone. What I encountered confirmed some of my concerns.
AI as a Tool, Not a Therapist
I would be remiss not to say that I use AI—and quite effectively. It has streamlined many of my business processes, from drafting emails and organizing ideas to helping with content generation for blogs and psychoeducational materials. In that regard, AI has been a powerful assistant, not unlike hiring a virtual intern who never sleeps.
But let’s not confuse convenience with clinical capability.
Just because ChatGPT can help me write about cognitive behavioral therapy doesn’t mean it’s equipped to conduct it. Just because it can summarize the stages of grief doesn’t mean it can sit with someone in the raw, unfiltered agony of loss. And just because it can echo empathy doesn't mean it can understand context, culture, trauma history, or the delicate nuance behind why someone keeps choosing the same unhealthy pattern.
So while I’m grateful for how AI has enhanced my workflow, I’m not convinced—nor should anyone be—that ChatGPT, or any AI model, is ready to take over the role of a licensed therapist. At least not if we still value what makes therapy transformative: connection, challenge, and the capacity to sit in discomfort and complexity without defaulting to quick fixes.
Here's what I've learned:
Observations from Interacting with ChatGPT
Lack of Practical Tools:
While ChatGPT maintains a compassionate tone, it often fails to provide the actionable strategies or interventions that are central to effective therapy. When I directed it with prompts like, "Provide me counseling based on Beck's CBT for familial issues with actionable strategies," ChatGPT covered nothing beyond the basics, like cognitive distortions. When prompted, "Provide me counseling based on the Jungian approach," it again offered basics like "Let's consider your archetype," but no actionable strategies. I repeated this process with multiple widely acknowledged approaches to mental health practice, and all fell flat.
Persistent Affirmation:
Even when explicitly instructed to avoid excessive positivity, ChatGPT continued offering affirming responses. This raises serious concerns about its ability to challenge unhelpful thought patterns—a cornerstone of true therapeutic progress.
Counseling is not cheerleading. It’s not about making someone feel good in the moment; it’s about guiding them toward what is good for them in the long run. Sure, a little sugar might help the medicine go down—but constant affirmation? That’s not medicine. That’s just more of the same sugar that gave you the stomachache in the first place.
Reinforcement of Reassurance-Seeking Behavior:
AI’s habit of offering constant validation—without critique or challenge—can unintentionally reinforce behaviors like reassurance-seeking, which are often counterproductive in therapy.
As a therapist, I absolutely provide a safe and compassionate space. And yes, there are times when reassurance is clinically appropriate. But as I’ve discussed in previous blogs and podcasts, reassurance can also be misused—either through deliberate feigning or because a client simply hasn’t yet learned how to articulate or meet their needs in healthier ways.
It is not my job—nor is it ethical—to offer blanket reassurance without intentional focus and a willingness to explore the underlying behaviors driving it. Sadly, many in the mental health field fall into this pattern, confusing comfort with care.
It’s not just ineffective—it’s a breach of clinical integrity.
Superficial Engagement:
Engaging with ChatGPT often feels like flipping through a self-help book—soothing in the moment, but ultimately surface-level. It offers comfort, not change.
I once heard a story about someone who owned an entire bookshelf filled with self-help books. When asked, “How’s that working out for you?” they replied, “I’m about to buy another.” That’s the problem—not the consumption of material, but the absence of transformation.
Superficial engagement, whether with a book or a chatbot, may invite you to think about a behavior. But unless that invitation includes discomfort, friction, or a true challenge, the brain doesn’t engage in the kind of neurochemical process that leads to sustained change. Without that challenge, there’s no dopamine reward loop, no sense of mastery—just passive consumption masquerading as progress.
And that’s not therapy. That’s distraction dressed up in digital empathy.
Therapy Is More Than Listening
Effective therapy is far more than empathetic listening. It’s an active, layered process that demands clinical insight, pattern recognition, the ability to challenge cognitive distortions, and the skill to guide real behavioral change. These are not tasks that can be outsourced to an algorithm.
When I sit with a client, I’m not just hearing their words—I’m assembling a comprehensive picture. I’m integrating their intake data, assessment results, personal history, the dynamics of key relationships, and their habitual patterns. I’m also watching closely for the subtle, often subconscious cues: a twitch of the jaw, a flicker of disgust, a tremble in the hands, arms crossed in self-protection, speech that rushes ahead of their breath. These are the threads that, when pulled carefully, begin to untangle the deeper work.
I often tell clients, “Your life is like a plate of spaghetti—I’m just looking for the right noodle to pull so we can make sense of the mess.” That’s therapy.
It’s not linear. It’s not scripted. It’s not canned empathy and pre-written coping statements. It’s real-time strategy built from deep human presence, psychological training, and the sacred trust between two people in the room.
AI can’t do that. And we shouldn't pretend it can.
“But It’s Cheap…”
That’s the most common justification I hear: “AI therapy is cheaper.”
And yes, affordability matters. I’m not blind to the reality that therapy can be expensive, and access is a very real issue for many. But let’s not confuse cost with value—or worse, mistake convenience for clinical care.
Sure, sometimes a head of lettuce from Walmart is just as good as the one from your bougie grocer. But therapy isn’t lettuce.
Let’s use a different analogy: Imagine a critical part needed to hold together the engine of an airplane carrying 300 passengers. You could buy that part cheaply from a questionable manufacturer overseas… or you could invest in one that’s been rigorously tested, verified, and proven to withstand pressure, time, and turbulence. One decision increases the risk of catastrophic failure. The other reduces it by 99%.
Therapy isn’t a luxury item—it’s a structural component in the engine of someone’s life. If you rely on a quick-fix solution that hasn’t been tested, one that can’t truly see or challenge you, you may feel like you're in motion—but beneath the surface, you're one crack away from collapse.
Cheap doesn’t always mean cost-effective. And when it comes to mental health, the real cost of “cheap” might be your long-term well-being. By the way, we see this with the "big box" counseling practices, too.
Ethical and Safety Considerations
Beyond the therapeutic limitations of AI, there are serious ethical concerns that cannot be ignored. Data privacy is chief among them. AI systems require data to function—and that includes your personal, often deeply private, emotional information. Once entered, that data may be stored, analyzed, or even shared under terms most users never read. The risk of misdiagnosis is another issue, especially when algorithms are making assumptions without the full clinical context.
But there’s a newer, quieter shift happening that raises even more alarm for me: EHR platforms are now offering AI tools that listen in on sessions to generate clinical notes automatically. On the surface, it sounds like a gift—no more late-night note-writing, improved efficiency, and better documentation. But I don’t trust it.
And I don’t say that lightly.
To me, allowing AI to "sit in" on a therapy session—even with client permission—feels like a breach of one of the most sacred elements of the therapeutic relationship: that what is shared between the client and therapist stays between the client and therapist. Not the client, therapist, and an algorithm silently transcribing in the background.
Even if I were to disclose the use of such a tool and gain informed consent, I have to ask: what does that say about me? About my ability to manage my clinical time, uphold documentation standards, and protect client confidentiality with the seriousness it deserves?
It might save me an hour. But at what cost? I didn’t become a therapist to outsource the core duties of the profession—I became one to be present, fully, with the people I serve. That means their stories stay in the room. Not in the cloud.
Where Does This Leave Us?
AI may be evolving, and its role in mental health will no doubt continue to expand—but we must be wise about what we allow it to replace. Convenience is not care. Empathy written by an algorithm is not the same as empathy lived and shared by another human being. And while AI can support systems, it cannot substitute for the soul.
Therapy is sacred work. It is messy, nuanced, and deeply human. It requires discernment, intuition, and the ability to hold tension, silence, and paradox—none of which can be coded. So while AI might help us write blogs, schedule appointments, or even brainstorm content like this very post, it cannot replace the heart of the work.
Let’s not lose sight of what makes therapy transformative in the first place: real connection, real challenge, and real change. And that’s something no machine will ever be able to replicate.
And let’s be honest—ChatGPT helped me develop this blog. But even here, I’m the one telling it what to say, how to say it, and when it misses the mark.
Ask yourself: Is that what you want from a therapist, someone you have to coach into saying the right thing? Or do you actually want to get better?

If you’re ready for real change—not just comfort—therapy with a trained, human professional can help you move forward.
If you're in Georgia and looking for a data-driven, compassionate, and no-nonsense approach to mental health, I invite you to reach out. Let's work together to untangle what’s keeping you stuck—no scripts, no shortcuts, just real work that leads to real growth.
Not in Georgia?
Explore the Untangling workbook series—designed to guide you through trauma, anxiety, and grief with the same structured, evidence-based approach I use in therapy. It’s not a replacement for a therapist, but it is a powerful tool for getting started.