Robot Clinician?: An AI Caution
- Cara Gruhala

- Mar 27
- 4 min read
As most people who interact with this practice know, we have a healthy and regular relationship with technology in many forms. We love hearing about clients' favorite shows, the recent video game they are enjoying, how they connect with loved ones near and far through written and video means, how they use technology to allow for more flexible working conditions, and we adore a good meme-share moment! You're reading this using technology right now, and may have been directed here by a social media platform or a browser search. Technology serves many, many helpful purposes.
I am not here to tell anyone how to feel about Artificial Intelligence (AI). This can be a highly polarizing topic. I am here today to voice concern regarding recent trends in the use of AI for mental health purposes. I fully acknowledge that it can be very tempting to use AI this way, because the idea of having 24-hour access to support, without travel, for free is tempting! Finances, time, transportation, schedule conflicts, geographic location, provider shortages, and more can be barriers to accessing therapy. The idea of being face-to-face with another human and sharing vulnerable things can certainly be scary. The thought of getting answers without this vulnerability can feel "safer" to some. Still, some significant concerns remain that are hard to dismiss.

Confidentiality
One of the things that creates the safe container for therapy to be effective is the promise that the information you share will be protected and not shared outside of your session. Licensed clinicians are bound by our professional codes of ethics, our licensing agencies, and by HIPAA to protect client information. Many AI platforms have no such protective requirements. There is often no guarantee of who is or is not seeing your information, or of how it may be used or sold. At minimum, the industry is still shifting enough at this time that we lack sufficient clarity to know that user information is, and will continue to be, protected.
It's Designed to Please You
AI, like social media, is at its core designed to please you and keep you using it. AI has different motivations than social media for keeping you engaged, but its design to please the user is still present. While this might sound insignificant or even positive, it can cause significant problems when used for mental health support.
Clinicians are trained not to give advice. We are trained to help clients come to their own conclusions through exploration of emotions, physical cues, historical patterns, awareness of client goals, and more. AI often provides solutions, and may do so in a brief interaction, without all pertinent details. There have also been instances where AI has been so intent on pleasing the user that it provides harmful or even dangerous advice. While many AI platforms are trained to catch very overtly harmful questions from users, they still miss important points, which leads to stigmatization of mental health conditions or dangerous suggestions.

Humanity
AI is designed to mimic humans and human interaction. It learns patterns of speech and gathers information to make suggestions. While AI is pulling from massive amounts of human-created research and information, AI is not human. AI can try to support, but AI will never be able to provide genuine empathy. Licensed clinicians are not only trained in how to help; they have also experienced situations that cause disappointment, heartache, grief, anxiety, sadness, joy, pride, love, and more. Even when users craft prompts or questions intended to elicit a therapist-level response from AI, the outcome is often far from safe, ethical, or even helpful. Additionally, while clinicians are specifically trained to do no harm and to not abandon clients, AI has no such commitments.
While most clinicians operate from one or more therapeutic theories in their work, many studies have shown that a healthy therapeutic relationship, as perceived by the client, is the most impactful agent of change in therapy. The "Common Factors Model" shows that the therapeutic model accounts for around 15% of change, while the therapeutic relationship between client and clinician accounts for over 30% of change (Asay & Lambert, 1999).
Licensed clinicians are also trained to be highly attuned to both obvious and micro communications, expressions, and body language of their clients, in order to treatment plan, diagnose, and communicate with their clients as accurately as possible. Clinicians can hear if your words and the tone of your voice tell two different stories. They're trained to listen to the pause before you speak, hold your hopes and dreams, and remember important elements of your story. They provide genuine human interactions and facilitate genuine connection. Frequent users of AI have even reported being lonelier than those who have similar human interactions but don't use AI.
Final Thoughts
After more than 10 years in this industry, it is more apparent to me than ever that therapy is part science and part art. The science certainly could be learned, but the most successful clinicians also work to grow in the art for the remainder of their careers. They work to gain skill and to strengthen their intuition for when to use particular skills. They recognize human patterns at levels imperceptible to technology at this time. They show up and help you to change patterns, validate your human experience with their own, and help you understand how to carry your experiences in therapy into improved life and relationships outside the therapy office. I often tell clients that ultimately my job is to work myself out of a job with them, and I can be on that journey with them for as long as they need me. While I can understand the pull toward accessible and safe-feeling technology that could benefit mental health, at this time I do not believe that AI is the best tool for the job. There are many ways for people to access quality human therapists, and I am always happy to help people find connection with providers, even if that is outside of my practice.
